
moderated Re: amaizing discovery with my iPhone and jaws!

Josh Kennedy
 

I was thinking about the brightness issue in videos. What if the JAWS virtual camera could indicate brightness? First add a brightness toggle. Then, while in the virtual camera layer, as you move the virtual camera around the screen and zoom it in or out as needed, turning on the indicate-brightness toggle would make JAWS play a constant tone: the brighter the area, the higher the pitch of the sound you hear from JAWS. You could customize it further: indicate brightness by the volume or loudness of the sound, or indicate brightness with rapid clicks, where the more rapid the clicks, the brighter the area. The virtual camera would do for JAWS what the software synthesizer did for TTS technology back in the early to mid 90s. In the 90s you needed a separate hardware synthesizer box just so JAWS could talk; now no box is needed, because Eloquence, or whichever synthesizer you like, runs as software on your PC and speaks through the sound card. The camera would also be a bit like ZoomText: if we can magnify a screen in real time for visually impaired folks, JAWS should be able to use a software camera, kind of like Seeing AI or Supersense on steroids, but all nicely baked into JAWS.
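The pitch and click-rate mappings suggested above can be sketched in a few lines. This is purely illustrative: the function names, the 200 to 2000 Hz range, and the click-rate ceiling are all made-up assumptions, and the pixel sampling and tone playback that a real implementation would need are left out.

```python
# Hypothetical sketch of the proposed brightness-indication modes.
# The range constants below are arbitrary choices, not anything JAWS defines.

LOW_HZ = 200.0    # tone pitch for a fully dark area
HIGH_HZ = 2000.0  # tone pitch for a fully bright area

def brightness_to_pitch(brightness: int) -> float:
    """Map a 0-255 brightness value to a tone frequency in Hz (brighter = higher)."""
    if not 0 <= brightness <= 255:
        raise ValueError("brightness must be 0-255")
    return LOW_HZ + (HIGH_HZ - LOW_HZ) * brightness / 255

def brightness_to_click_rate(brightness: int, max_clicks_per_sec: float = 20.0) -> float:
    """Alternative mode: brighter areas produce more rapid clicks."""
    return max_clicks_per_sec * brightness / 255

# A real implementation would sample the pixel region under the virtual
# camera in a loop and feed these values to a tone or click generator.
```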


moderated Re: amaizing discovery with my iPhone and jaws!

Josh Kennedy
 

The videos that Supersense was reading to me were videos in which YouTube subtitling was unavailable, because in such videos the subtitles are drawn directly to the screen; in other words, they are baked into the video itself as it plays. There is nothing for YouTube to hook onto, no external subtitle file, nothing. Supersense, using my iPhone camera, was looking at the screen in real time and figuring out the text. Every now and then it would take two or three passes over the text, each time reading it more accurately, but that did not affect my ability to enjoy the subtitled video. Now if only it could take those three passes, figure out which was most accurate based on an OCR dictionary, and read me only the most accurate one.
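The "pick the most accurate pass" idea above could work roughly like this: score each OCR pass by the fraction of its words found in a dictionary and keep the best-scoring pass. A minimal sketch; the function names, the toy word list, and the sample OCR passes are all invented for illustration, not anything Supersense actually does.

```python
# Hypothetical sketch: choose the best of several OCR passes by dictionary hit rate.

def dictionary_score(text: str, dictionary: set) -> float:
    """Fraction of words in `text` that appear in `dictionary` (case-insensitive)."""
    words = [w.strip(".,!?\"'").lower() for w in text.split()]
    words = [w for w in words if w]
    if not words:
        return 0.0
    return sum(w in dictionary for w in words) / len(words)

def best_ocr_pass(passes: list, dictionary: set) -> str:
    """Return the OCR pass with the highest dictionary hit rate."""
    return max(passes, key=lambda p: dictionary_score(p, dictionary))

passes = [
    "I wi11 see y0u at the stat1on",   # noisy first pass
    "I will see y0u at the station",   # better second pass
    "I will see you at the station",   # clean third pass
]
dictionary = {"i", "will", "see", "you", "at", "the", "station"}
print(best_ocr_pass(passes, dictionary))  # → "I will see you at the station"
```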


moderated Re: amaizing discovery with my iPhone and jaws!

Glenn / Lenny
 


I think it would have to be a different window, like using Convenient OCR, and if you want the controls, you would have to alt-tab between them.
 

----- Original Message -----
Sent: Tuesday, May 24, 2022 9:56 AM
Subject: Re: amaizing discovery with my iPhone and jaws!



moderated Re: amaizing discovery with my iPhone and jaws!

Josh Kennedy
 

Well, if JAWS had a virtual camera that was zoomable and could also detect brightness, with JAWS describing the brightness, then yes, we could use the JAWS virtual camera to edit videos.


moderated Re: amaizing discovery with my iPhone and jaws!

Josh Kennedy
 

I have already sent my observations and suggestions to 
suggestions@...


moderated Re: amaizing discovery with my iPhone and jaws!

Josh Kennedy
 

In other words: take the Freedom Scientific PEARL camera, bake it into JAWS as a software camera, a keyboard-accessible camera, then put it on steroids, letting it recognize various on-screen controls and letting you activate such controls by directly moving and clicking the mouse, or giving hints on accessing such controls, maybe with an Xbox controller if needed.
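The "activate a recognized control by clicking it" part reduces to simple geometry once a recognition pass has produced labelled bounding boxes. A minimal sketch, assuming a hypothetical DetectedControl record; the detection step itself (OCR, object recognition, etc.) is not shown.

```python
# Hypothetical sketch: turn a detected control's bounding box into a click target.

from dataclasses import dataclass

@dataclass
class DetectedControl:
    label: str     # e.g. "OK button", as spoken to the user
    left: int
    top: int
    width: int
    height: int

    def click_point(self):
        """Screen coordinates a pointer should target to activate this control."""
        return (self.left + self.width // 2, self.top + self.height // 2)

ok = DetectedControl("OK button", left=100, top=200, width=80, height=30)
print(ok.click_point())  # → (140, 215)
```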


moderated Re: amaizing discovery with my iPhone and jaws!

Mike Pietruk
 

I wonder if this is in any way related to YouTube's ability to generate transcripts of its videos.

I posted a question here on the list a couple of months back concerning a news item that David Goldfield posted on his tech news list.

Here is the text I posted on this list at that time, asking how this might be used with JFW. Unfortunately, I got no responses back in March; perhaps now someone can come up with the keystrokes to answer the question I asked then.

------
Overnight, David Goldfield, on his tech-vi@groups.io list, forwarded an item concerning "How to Get the Transcript of a YouTube Video."

I am wondering if JFW 2022 can do this. If so, can someone translate this into JAWS steps, or would a JAWS script be needed to accomplish it?

From the article, here are the instructions for a PC:
-----
On a desktop or laptop, head on over to YouTube.com in a web browser such as Google Chrome and open a video to watch.

Next, click the three-dot menu icon underneath the title of the video.

Select "Show Transcript" from the menu.

The transcript box will open and you'll see the captions listed along with timestamps. You can click a caption to jump to that part of the video.





Wonder if this can be done on a pc, and, if so, how?
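For what it's worth, the transcript box described in the article is just a list of captions with timestamps. Here is a small sketch of how such entries could be rendered as "M:SS text" lines; the function names and sample captions are made up, and this is not YouTube's or JAWS's actual data format.

```python
# Hypothetical sketch: format caption entries the way a transcript box lists them.

def format_timestamp(seconds: float) -> str:
    """Render seconds as M:SS, the way transcript timestamps are shown."""
    m, s = divmod(int(seconds), 60)
    return f"{m}:{s:02d}"

def format_transcript(captions: list) -> str:
    """Each caption is {'start': seconds, 'text': str}; one line per caption."""
    return "\n".join(f"{format_timestamp(c['start'])} {c['text']}" for c in captions)

captions = [
    {"start": 0.0, "text": "Hello and welcome."},
    {"start": 4.5, "text": "Today we look at transcripts."},
    {"start": 65.0, "text": "Click a line to jump there."},
]
print(format_transcript(captions))
```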


moderated Re: amaizing discovery with my iPhone and jaws!

Karina Castro
 

Hello Josh, I sincerely support everything you've said. I think if JAWS becomes what you suggest, we might even be able to edit videos, albeit very superficially. Logically, light filters or different perspectives on scenes could not be used, but acceptable videos could be made: for example, cutting and pasting several short videos to make larger ones, or taking a big video and splitting it up to make video summaries.

Greetings!

-- 
Salu2, Kari


moderated Re: amaizing discovery with my iPhone and jaws!

Glenn / Lenny
 


I wonder if there is an Android app for this to run on smart TVs.
Smart TVs have Android inside.
Glenn
 



moderated Re: amaizing discovery with my iPhone and jaws!

Jeff Christiansen
 

Visparrow

Not sure why a script cannot be written to accomplish this, minus the clunkiness of a separate device.

 



moderated Re: amaizing discovery with my iPhone and jaws!

Terrie Terlau
 

Hi Josh,

I am sure that you have already done this, but in case you haven't, please send these observations to Freedom Scientific, or whatever they call themselves at the moment (smile). What a wonderful thing you describe here!

Best,

Terrie Terlau

 



moderated Re: amaizing discovery with my iPhone and jaws!

Joseph Machise
 


Great.

----- Original Message -----
Sent: Tuesday, May 24, 2022 9:43 AM
Subject: amaizing discovery with my iPhone and jaws!



moderated amaizing discovery with my iPhone and jaws!

Josh Kennedy
 

hi
I just made an amazing discovery with my iPhone and JAWS. First I went to youtube.com, pulled up a Japanese anime episode with English subtitles, and started watching it. I then used JAWS to put YouTube into full screen mode. Next I loaded the Supersense app on my iPhone and put it into its live reader mode (instant reader, quick read, constant read, whatever it is called), pointed the phone at the screen, and held it as steady as possible, and my iPhone was reading the subtitles to me! It was so cool that my iPhone was doing something JAWS can't do: my phone was reading text drawn directly to the screen, as it appeared, in real time, as the video played!

I really wish JAWS had a virtual camera mode, further customizable with JAWS scripts and Frames Manager, so JAWS could do what my iPhone was doing, but take it even further. Let the JAWS virtual camera recognize buttons and other on-screen controls, highlight bars, graphics and such, and give the user access to those items using the keyboard. You could even use the JAWS virtual camera to scan entire inaccessible Amazon Kindle books and turn them into text, or scan directly to Word. NVDA has an outdated addon that tries to read live text but fails pretty badly at it; I think JAWS could do better with a virtual camera.

Why do I need my iPhone to read live video subtitles? Amazing that Seeing AI and Supersense can do what JAWS cannot: give me access to stuff JAWS can't even see, because JAWS does not have a virtual camera feature. If JAWS had a scriptable virtual camera that could also be customized with both Script Manager and Frames Manager, JAWS would be doing something NVDA and Narrator don't do, and it would be light-years ahead of the other Windows screen readers. But until JAWS has a virtual camera, I guess I have to use a real physical iPhone camera to read live video subtitles.
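One small piece such a virtual camera would need is subtitle change detection: announce a subtitle once, when it first appears, rather than on every capture frame. A minimal sketch; the OCR capture and speech output are assumed away, and the function name is invented for illustration.

```python
# Hypothetical sketch: given a stream of OCR snapshots of the subtitle region,
# decide which ones to speak. Only non-empty text that differs from the
# previous snapshot is announced, so a subtitle is read once as it appears.

def subtitles_to_announce(snapshots):
    announced = []
    previous = ""
    for snap in snapshots:
        text = snap.strip()
        if text and text != previous:
            announced.append(text)
        previous = text  # a blank frame clears the current subtitle
    return announced

frames = ["", "Hello there!", "Hello there!", "", "General Kenobi."]
print(subtitles_to_announce(frames))  # → ['Hello there!', 'General Kenobi.']
```

A real implementation would also have to tolerate OCR jitter (the same subtitle read slightly differently on successive frames), perhaps by fuzzy-matching against the previous announcement.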


moderated Re: Unknown Constant when Compiling User-default.jss.

Josh Kennedy
 

Do you have Leasey installed? If so, that may be the problem; Leasey overwrites many user scripts with its own custom ones.


moderated Re: Moving To An In Page Link In An Email

Mark Fisher
 

Hi, thanks for the suggestions. Yes, it's not the greatest tagged email either; they don't use headings. I'll try the save-as-HTML approach and see how that goes.
--
Mark Fisher
Manager - People Systems
Water Corporation of Western Australia


moderated Re: Skype

Steve Nutt
 

Hi Mike,

You might want to look at these JAWS scripts for Skype, from Doug Lee:

https://www.dlee.org/skype/

All the best

Steve

-----Original Message-----
From: main@jfw.groups.io <main@jfw.groups.io> On Behalf Of Mike Gurney
Sent: 22 May 2022 09:58
To: main@jfw.groups.io
Subject: Skype

Hello

I have recently returned to using Jaws. I am trying to use Skype without much success. Do I need to enable anything within the Jaws settings or files?

Mike

--
Mike Gurney


moderated Unknown Constant when Compiling User-default.jss.

Dennis Brown
 

I get the Unknown Constant error when I try to compile default.jss.

The line is as follows, and is in the MSAAAlertHandler event.

Let sMsgLong = formatString (cmsgAlert_L, sText)

 

Thanks,

Dennis T. Brown

 







moderated Toggling Punctuation from within a script.

Dennis Brown
 

I’d like to have punctuation set to All in Notepad and Word, and then set to None when exiting.

I know I can use AutoStart and AutoFinish events, but scripts and functions I used in the past have changed a great deal.

Any suggestions?

 

 

Thanks,

Dennis T. Brown

 







moderated Re: Skype

Mich Verrier
 

Hi, no, you don't need to enable anything in JAWS; however, it would help to go to www.dlee.org to get the JAWS scripts for Skype. HTH. From Mich.

-----Original Message-----
From: main@jfw.groups.io <main@jfw.groups.io> On Behalf Of Mike Gurney
Sent: May 22, 2022 4:58 AM
To: main@jfw.groups.io
Subject: Skype

Hello

I have recently returned to using Jaws. I am trying to use Skype
without much success. Do I need to enable anything within the Jaws
settings or files?

Mike

--
Mike Gurney


moderated Skype

Mike Gurney
 

Hello

I have recently returned to using Jaws. I am trying to use Skype without much success. Do I need to enable anything within the Jaws settings or files?

 Mike

--
Mike Gurney
