Re: amazing discovery with my iPhone and JAWS!


Karina Castro
 

Hello Josh, I sincerely support everything you've said. I think if JAWS becomes what you suggest, we might even be able to edit videos, albeit very superficially. Naturally, light filters or different camera perspectives would be out of reach, but acceptable videos could still be made: for example, cutting and pasting several short clips to make longer ones, or taking a long video and splitting it up to make video summaries.

Greetings!

On 5/24/2022 at 07:43, Josh Kennedy wrote:
hi
I just made an amazing discovery with my iPhone and JAWS. First I went to youtube.com and started watching a Japanese anime episode with English subtitles. I then used JAWS to put YouTube into full-screen mode, loaded the SuperSense app on my iPhone, put it into its instant reader (live reader) mode, pointed the phone at the screen, and held it as steady as possible. My iPhone was reading the subtitles to me! It was so cool that my iPhone was doing something JAWS can't do: reading text drawn directly to the screen, as it appeared, in real time, as the video played.

I really wish JAWS had a virtual camera mode, further customizable with JAWS scripts and the frames manager, so JAWS could do what my iPhone was doing and take it even further. Let the JAWS virtual camera recognize buttons and other on-screen controls, highlight bars, graphics, and so on, and give the user keyboard access to those items. You could even use the JAWS virtual camera to scan entire inaccessible Amazon Kindle books and turn them into text, or scan directly to Word. NVDA has an outdated add-on that tries to read live text but fails pretty badly at it; I think JAWS could do better with a virtual camera.

Why do I need my iPhone to read live video subtitles? It's amazing that Seeing AI and SuperSense can give me access to stuff JAWS can't even see, simply because JAWS has no virtual camera feature. If JAWS had a scriptable virtual camera that could also be customized with both the script manager and the frames manager, it would be doing something NVDA and Narrator don't do, putting JAWS light-years ahead of the other Windows screen readers. But until JAWS has a virtual camera, I guess I have to use a real, physical iPhone camera to read live video subtitles.
-- 
Salu2, Kari
