moderated amazing discovery with my iPhone and JAWS!
Josh Kennedy
hi
I just made an amazing discovery with my iPhone and JAWS. First I went to youtube.com, pulled up a Japanese anime episode with English subtitles, and started watching. I then used JAWS to put YouTube into full-screen mode. Next I loaded the SuperSense app on my iPhone and put it into its instant reader, quick read, or live reader mode, pointed the phone at the screen, and held it as steady as possible, and my iPhone was reading the subtitles to me! It was so cool that my iPhone was doing something JAWS can't do: my phone was reading text drawn directly to the screen, as it appeared, in real time, as the video played!

I really wish JAWS had a virtual camera mode, further customizable with JAWS scripts and Frames Manager, so JAWS could do what my iPhone was doing, but take it even further. Let the JAWS virtual camera recognize buttons and other on-screen controls, highlight bars, graphics, and so on, and give the user access to such items from the keyboard. You could even use the JAWS virtual camera to scan entire inaccessible Amazon Kindle books and turn them into text, or scan directly to Word. NVDA has an outdated add-on that tries to read live text but fails pretty badly at it; I think JAWS could do better with a virtual camera.

Why do I need my iPhone to read live video subtitles? Amazing that Seeing AI and SuperSense can do what JAWS cannot, giving me access to stuff JAWS can't even see, because JAWS does not have a virtual camera feature. If JAWS had a scriptable virtual camera that could also be customized with both Script Manager and Frames Manager, then JAWS would be doing something NVDA and Narrator don't do, and it would be light-years ahead of the other Windows screen readers. But until JAWS has a virtual camera, I guess I have to use a real, physical iPhone camera to read live video subtitles.
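At its core, the live-reading behavior described here boils down to a loop: grab a frame, run OCR on it, and speak only text that has not just been spoken, so a subtitle that stays on screen for many frames is read once. SuperSense's actual internals are unknown; this is just a minimal sketch of that dedup logic, with OCR and speech stubbed out as plain functions:

```python
def read_new_subtitles(frames, ocr, speak):
    """Speak each recognized subtitle once, skipping repeats.

    frames: iterable of video frames (whatever objects `ocr` accepts)
    ocr:    function frame -> recognized text (e.g. a real OCR engine)
    speak:  function text -> None (e.g. a TTS call)
    """
    last = ""
    for frame in frames:
        text = ocr(frame).strip()
        # Skip empty OCR results and exact repeats of the line just
        # spoken, so a subtitle shown across many frames is read once.
        if text and text != last:
            speak(text)
            last = text

# Tiny demonstration with stubbed OCR: frames are already strings.
if __name__ == "__main__":
    spoken = []
    frames = ["", "Hello!", "Hello!", "Goodbye.", "Goodbye.", ""]
    read_new_subtitles(frames, ocr=lambda f: f, speak=spoken.append)
    print(spoken)  # ['Hello!', 'Goodbye.']
```

A real "virtual camera" would replace the `ocr` stub with screen capture plus an OCR engine and the `speak` stub with the screen reader's speech output; the dedup loop itself stays the same.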
Joseph Machise
Great.
----- Original Message -----
From: Josh
Kennedy
Sent: Tuesday, May 24, 2022 9:43 AM
Subject: amazing discovery with my iPhone and JAWS!
Terrie Terlau
Hi Josh, I am sure that you have already done this, but in case you haven't, please send these observations to Freedom Scientific, or whatever they call themselves at the moment (smile). What a wonderful thing you describe here! Best, Terrie Terlau
From: main@jfw.groups.io <main@jfw.groups.io> On Behalf Of Josh Kennedy
Sent: Tuesday, May 24, 2022 9:43 AM
To: main@jfw.groups.io
Subject: amazing discovery with my iPhone and JAWS!
hi
Jeff Christiansen
Visparrow
Not sure why a script cannot be written to accomplish this, minus the clunkiness of a separate device.
From: main@jfw.groups.io <> On Behalf Of Terrie Terlau
Sent: Tuesday, May 24, 2022 8:10 AM
To: main@jfw.groups.io
Subject: Re: amazing discovery with my iPhone and JAWS!
From: main@jfw.groups.io <main@jfw.groups.io> On Behalf Of Josh Kennedy
hi
K0LNY
I wonder if there is an Android app for this to run
on smart TVs.
Smart TVs have Android inside.
Glenn
----- Original Message -----
From: Terrie Terlau
Sent: Tuesday, May 24, 2022 9:09 AM
Subject: Re: amazing discovery with my iPhone and JAWS!
From: main@jfw.groups.io <main@jfw.groups.io> On Behalf Of
Josh Kennedy
hi
Karina Castro
Hello Josh, I sincerely support everything you've said. I think if JAWS becomes what you suggest, we might even be able to edit videos, albeit very superficially. Understandably, light filters or different perspectives on the scenes could not be used, but acceptable videos could be made, for example by cutting and pasting several short videos to make larger ones, or likewise by taking a big video and splitting it up to make video summaries.
On 5/24/2022 at 07:43, Josh Kennedy wrote:
hi
-- Regards, Kari
Mike Pietruk
I wonder if this is in any way related to YouTube's ability to generate transcripts of its videos.

I posted a question here on the list a couple of months back concerning a news item that David Goldfield posted on his tech news list. Here is the text I posted on this list at that time, asking how this might be used with JFW. Unfortunately, I got no responses back in March; perhaps now someone can come up with the keystrokes for the question I asked at that time:

Overnight, David Goldfield, on his tech-vi@groups.io list, forwarded an item concerning "How to Get the Transcript of a YouTube Video." I am wondering if JFW 2022 can do this. If so, can someone translate this into JAWS steps, or would a JAWS script be needed to accomplish it? From the article, here are the instructions for a PC:

On a desktop or laptop, head on over to YouTube.com in a web browser such as Google Chrome and open a video to watch. Next, click the three-dot menu icon underneath the title of the video. Select "Show Transcript" from the menu. The transcript box will open and you'll see the captions listed along with timestamps. You can click a caption to jump to that part of the video.

Wonder if this can be done on a PC, and, if so, how?
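For what it's worth, once a video's transcript entries are available as caption/timestamp pairs (the shape transcript-fetching tools commonly return; the exact field names below are an assumption), flattening them into plain readable text for a screen reader is straightforward. A small sketch:

```python
def transcript_to_text(entries, with_timestamps=False):
    """Flatten caption entries into plain readable text.

    entries: list of dicts like {"text": "...", "start": seconds},
    the shape commonly produced by transcript tools (assumed here).
    """
    lines = []
    for e in entries:
        if with_timestamps:
            # Convert the start offset in seconds to an MM:SS prefix.
            m, s = divmod(int(e["start"]), 60)
            lines.append(f"[{m:02d}:{s:02d}] {e['text']}")
        else:
            lines.append(e["text"])
    return "\n".join(lines)

if __name__ == "__main__":
    demo = [{"text": "Welcome back.", "start": 0.0},
            {"text": "Today we look at JAWS.", "start": 4.2}]
    print(transcript_to_text(demo, with_timestamps=True))
```

This doesn't answer the JAWS-keystrokes question, but it shows how little processing the transcript data itself needs once retrieved.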
Josh Kennedy
In other words, take the Freedom Scientific PEARL camera and bake it into JAWS as a software camera, a keyboard-accessible camera, then put it on steroids: let it recognize various on-screen controls and let you activate such controls by directly moving and clicking the mouse, or give hints on accessing such controls, maybe with an Xbox controller if needed.
Josh Kennedy
Well, if JAWS had a virtual camera that was zoomable and could also detect brightness, with JAWS describing the brightness, then yes, we could use the JAWS virtual camera to edit videos.
K0LNY
I think it would have to be a different window, like using Convenient OCR, and if you want the controls, you would have to alt-tab between them.
----- Original Message -----
From: Josh
Kennedy
Sent: Tuesday, May 24, 2022 9:56 AM
Subject: Re: amazing discovery with my iPhone and JAWS!
Josh Kennedy
The videos that SuperSense was reading to me were videos for which YouTube subtitling was unavailable, because in such videos the subtitles are drawn directly to the screen; in other words, they are baked into the video itself as it plays. There is nothing for YouTube to hook onto, no external subtitle file, nothing. SuperSense with my iPhone camera was looking at the screen in real time and figuring out the text, and every now and then it would take two or three passes over the text, each time reading it more accurately, but that did not affect my ability to enjoy the subtitled video. Now if only it could take those three passes, figure out which one was most accurate based on an OCR dictionary, and read me only the most accurate selection.
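The pass-selection idea suggested above can be sketched simply: score each OCR pass by the fraction of its words found in a known-word dictionary, and keep the best-scoring pass. This is a toy illustration of that heuristic, not how SuperSense actually works:

```python
import re

def best_ocr_pass(passes, dictionary):
    """Pick the OCR pass whose words best match a known-word dictionary.

    passes:     list of candidate text strings from repeated OCR passes
    dictionary: set of lowercase known words
    Returns the pass with the highest fraction of dictionary words.
    """
    def score(text):
        # Pull out alphabetic word tokens; OCR noise like "qu1ck"
        # breaks into fragments that won't match the dictionary.
        words = re.findall(r"[a-z']+", text.lower())
        if not words:
            return 0.0
        return sum(w in dictionary for w in words) / len(words)

    return max(passes, key=score)

if __name__ == "__main__":
    vocab = {"the", "quick", "brown", "fox"}
    passes = ["thc qu1ck hrown fox",
              "the quick brown fox",
              "the qulck brown f0x"]
    print(best_ocr_pass(passes, vocab))  # the quick brown fox
```

A production version would want fuzzy matching and a much larger word list, but the ranking idea is the same.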
Josh Kennedy
I was thinking about the brightness issue in videos. What if the JAWS virtual camera could indicate brightness? First add a brightness toggle; then, when you are in the virtual camera layer, moving the virtual camera around the screen and zooming it in or out as needed, with indicate-brightness turned on, JAWS would play a constant tone or sound: the brighter the area, the higher the pitch of the sound you hear from JAWS. You could customize it further: indicate brightness by the volume or loudness of the sound, or indicate brightness with rapid clicks, the more rapid the clicks, the brighter the area.

The virtual camera would do for JAWS what the software synthesizer did for TTS technology back in the early to mid 90s. In the 90s you needed a separate hardware synthesizer box just so JAWS could talk; now no box is needed, because Eloquence, or whichever synthesizer you like, runs as software on your PC and speaks through the sound card. The camera would also be a bit like ZoomText: if we can magnify a screen in real time for visually impaired folks, JAWS should be able to use a software camera, kind of like Seeing AI or SuperSense on steroids, but all nicely baked into JAWS.
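The brightness-to-pitch mapping described here is easy to make concrete: average the grayscale pixel values under the virtual camera, then map that 0-255 value onto a frequency range. A minimal sketch, assuming a linear mapping and an arbitrary 200-2000 Hz range (both are illustrative choices, not anything JAWS defines):

```python
def mean_brightness(pixels):
    """Average brightness of an iterable of 0-255 grayscale pixel values."""
    pixels = list(pixels)
    return sum(pixels) / len(pixels) if pixels else 0.0

def brightness_to_pitch(brightness, low_hz=200.0, high_hz=2000.0):
    """Map a 0-255 brightness value to a tone frequency in Hz.

    Brighter areas map linearly to higher pitches, as suggested above;
    the 200-2000 Hz range is an arbitrary assumption.
    """
    b = max(0.0, min(255.0, float(brightness)))
    return low_hz + (high_hz - low_hz) * b / 255.0

if __name__ == "__main__":
    dark = mean_brightness([10, 20, 30])       # a dim region
    bright = mean_brightness([240, 250, 230])  # a bright region
    print(brightness_to_pitch(dark))    # low tone, ~341 Hz
    print(brightness_to_pitch(bright))  # high tone, ~1894 Hz
```

The other indication modes suggested (loudness, click rate) would reuse the same mapping with volume or clicks-per-second in place of frequency.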
Lori Lynn
Josh,
What a wonderful discovery. I’ve never been one who was interested in movies or videos in other languages because I can’t read the translation on the screen. But I may give your discovery a try. Thanks for sharing and for thinking outside the box.
Lori Lynn
From: main@jfw.groups.io <main@jfw.groups.io> On Behalf Of
Josh Kennedy
hi
Don Walls
If ifs and buts were candy and nuts, every day would be Christmas.
From: Josh
Kennedy
Sent: Tuesday, May 24, 2022 6:43 AM
To: main@jfw.groups.io
Subject: amazing discovery with my iPhone and JAWS!
hi
Josh Kennedy
Here is a good case for the JAWS virtual camera feature. If the JAWS virtual camera were implemented, then I could take college classes such as the ones found at the website below. But without the JAWS virtual camera, it would be difficult if not impossible for me to take such courses.
https://www.animemangastudies.com/resources/classes/
Jeff Christiansen
With 20 years of this, you’d think the schools would have addressed this
From: main@jfw.groups.io <> On Behalf Of Josh Kennedy
Claudia
This is pretty amazing, in my opinion. I could see this being useful in things like Zoom or Teams, when work chats are taking place, to read out the text being typed by other users. Sometimes I purposely mute JAWS because there is so much it is reading during these things.
Claudia
From: main@jfw.groups.io <main@jfw.groups.io> On Behalf Of Lori Lynn
Sent: Tuesday, May 24, 2022 10:27 AM
To: main@jfw.groups.io
Subject: Re: amazing discovery with my iPhone and JAWS!
From: main@jfw.groups.io <main@jfw.groups.io> On Behalf Of Josh Kennedy
hi
crayton Benner <craybay3198@...>
That's what I would do. That in itself would be such a good feature.
On 5/24/2022 7:09 AM, Terrie Terlau wrote:
David Ingram
I'd like to know what version of JAWS was used, as well as which iPhone was used for this task. Thank you for any information you might have concerning this question. Has anyone tried this with the PEARL camera and JAWS? If so, what results did you get?