Moderated: Amazing discovery with my iPhone and JAWS!
I just made an amazing discovery with my iPhone and JAWS. First I went to youtube.com, pulled up a Japanese anime episode with English subtitles, and started watching. I then used JAWS to put YouTube into full-screen mode. Next I loaded the SuperSense app on my iPhone, put it into its live reader mode (called Instant Reader or Quick Read), pointed the phone at the screen, and held it as steady as possible. My iPhone was reading the subtitles to me! It was so cool that my iPhone was doing something JAWS can't do: reading text drawn directly to the screen, in real time, as the video played.
I really wish JAWS had a virtual camera mode that could be further customized with JAWS scripts and the Frames Manager, so JAWS could do what my iPhone was doing, and take it even further. Let the JAWS virtual camera recognize buttons and other on-screen controls, highlight bars, graphics, and so on, and give the user access to such items from the keyboard. You could even use the JAWS virtual camera to scan entire inaccessible Amazon Kindle books and turn them into text, or scan directly to Word. NVDA has an outdated add-on that tries to read live text but fails pretty badly at it; I think JAWS could do better with a virtual camera.
Why do I need my iPhone to read live video subtitles? It's amazing that Seeing AI and SuperSense can give me access to things JAWS can't even see, all because JAWS does not have a virtual camera feature. If JAWS had a scriptable virtual camera that could also be customized with both the Script Manager and the Frames Manager, it would be doing something NVDA and Narrator don't do, and that would put JAWS light-years ahead of the other Windows screen readers. But until JAWS has a virtual camera, I guess I have to use a real, physical iPhone camera to read live video subtitles.
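For anyone curious what such a "virtual camera" might look like under the hood, here is a minimal sketch of the idea in Python. It assumes the third-party `mss` (screen capture) and `pytesseract` (OCR; requires the Tesseract engine installed) packages; the library choices, the `live_read` function, and the capture region are my own illustration of the concept, not anything JAWS or SuperSense actually does.

```python
import time

def new_subtitle_text(prev: str, curr: str) -> str:
    """Return only the subtitle text that is new in the current frame."""
    if curr == prev:
        return ""                        # nothing changed, stay silent
    if prev and curr.startswith(prev):
        return curr[len(prev):].strip()  # the line grew; speak the new tail
    return curr                          # a brand-new line; speak all of it

def live_read(region, interval=0.5):
    """Capture `region` of the screen, OCR it, and print newly appeared text."""
    # Third-party tooling (an assumption, not a JAWS API):
    import mss
    import pytesseract
    from PIL import Image

    last = ""
    with mss.mss() as grabber:
        while True:
            shot = grabber.grab(region)
            img = Image.frombytes("RGB", shot.size, shot.rgb)
            text = pytesseract.image_to_string(img).strip()
            fresh = new_subtitle_text(last, text)
            if fresh:
                print(fresh)  # a real screen reader would speak this instead
            last = text
            time.sleep(interval)

# Example region: the bottom strip of a 1920x1080 display, where subtitles
# usually sit (coordinates are illustrative only).
# live_read({"top": 900, "left": 0, "width": 1920, "height": 180})
```

The `new_subtitle_text` helper is the key part: it keeps the loop from re-reading the whole subtitle line on every frame and only announces what changed.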
Hi Josh,
I am sure that you have already done this, but in case you haven't, please send these observations to Freedom Scientific, or whatever they call themselves at the moment (smile). What a wonderful thing you describe here!
Best,
Terrie Terlau
Visparrow
Not sure why a script cannot be written to accomplish this, minus the clunkiness of a separate device.
Hello Josh, I sincerely support everything you've said. If JAWS became what you suggest, we might even be able to edit videos, albeit very superficially. Granted, we could not use light filters or work with different perspectives within a scene, but acceptable videos could be made, for example by cutting and pasting several short videos into a larger one, or the reverse: taking a big video and splitting it up to make video summaries.
Greetings!
-- Greetings, Kari
YouTube provides transcripts of its videos. I posted a question here on the list a couple of months back concerning a news item that David Goldfield posted on his tech news list. Here is the text I posted on this list at that time, asking how this might be used with JFW. Unfortunately, I got no responses back in March; perhaps now someone can come up with the keystrokes for the question I asked at that time.
------
Overnight, David Goldfield, on his tech-vi@groups.io list, forwarded an item concerning "How to Get the Transcript of a YouTube Video." I am wondering if JFW 2022 can do this. If so, can someone translate this into JAWS steps, or would a JAWS script be needed to accomplish it?
From the article, here are the instructions for a PC:
-----
1. On a desktop or laptop, head on over to YouTube.com in a web browser such as Google Chrome and select a video to watch.
2. Click the three-dot menu icon underneath the title of the video.
3. Select "Show Transcript" from the menu.
4. The transcript box will open and you'll see the captions listed along with timestamps. You can click a caption to jump to that part of the video.
I wonder if this can be done on a PC, and, if so, how?
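As a possible scripted alternative (outside JAWS entirely), a video's transcript can also be pulled down programmatically. The sketch below assumes the third-party `youtube-transcript-api` Python package and uses a placeholder video ID; treat it as one way this might be automated, not an official JAWS or YouTube solution.

```python
def format_transcript(entries):
    """Turn [{'text': ..., 'start': seconds}, ...] into 'MM:SS text' lines."""
    lines = []
    for entry in entries:
        minutes, seconds = divmod(int(entry["start"]), 60)
        lines.append(f"{minutes:02d}:{seconds:02d} {entry['text']}")
    return "\n".join(lines)

if __name__ == "__main__":
    # Assumed third-party package: pip install youtube-transcript-api
    from youtube_transcript_api import YouTubeTranscriptApi

    entries = YouTubeTranscriptApi.get_transcript("VIDEO_ID")  # placeholder ID
    print(format_transcript(entries))
```

The resulting plain-text transcript can then be read line by line with any screen reader, with no OCR or mouse steps needed.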
Josh,
What a wonderful discovery. I’ve never been one who was interested in movies or videos in other languages because I can’t read the translation on the screen. But I may give your discovery a try. Thanks for sharing and for thinking outside the box.
Lori Lynn
With 20 years of this, you'd think the schools would have addressed this.
From: main@jfw.groups.io <> On Behalf Of Josh Kennedy
Sent: Wednesday, May 25, 2022 7:24 AM
To: main@jfw.groups.io
Subject: Re: Amazing discovery with my iPhone and JAWS!
Here is a good case for the JAWS virtual camera feature. If the JAWS virtual camera were implemented, I could take college classes such as the ones found at the website below. But without the JAWS virtual camera, it would be difficult if not impossible for me to take such courses.
https://www.animemangastudies.com/resources/classes/
This is pretty amazing, in my opinion.
I could see this being useful in things like Zoom or Teams as well, when work chats are taking place, to read out the text being typed by other users.
Sometimes I purposely mute JAWS because there is so much it's reading during these meetings.
Claudia
That's what I would do. That in itself would be such a good feature.
On May 24, 2022, at 9:56 AM, Josh Kennedy <joshknnd1982@...> wrote:
In other words, take the Freedom Scientific PEARL camera, bake it into JAWS as a software camera, a keyboard-accessible camera, then put it on steroids: let it recognize various on-screen controls and let you activate such controls by directly moving and clicking the mouse, or give hints on accessing such controls, maybe with an Xbox controller if needed.