I post a lot of videos, so many now that I'll sometimes just stack several onto a single post.
But this one is special. This video shows the very core of the Save Point Technology working: the big idea I've spent years claiming will change everything.
I doubt anyone remembers, but for years now I've been saying cryptic, seemingly meaningless things, such as: "once I film a person walking down a road one time perfectly, I'll have a shot of every person walking down every road in every country of the world, forever."
Probably sounded like the rantings of a madman. Lol. Here's the system up and running as of today. Take any film clip of any type, such as my Unreal Engine control plates, and make that clip fit your script. No more budgets, no more walls, no more limits. If I have a clip of two people having a conversation at a table, I now have a clip of ANY two people sitting at ANY table having a conversation. Got a good shot of a motorcycle driving down a highway? Now that motorcycle can be driving down ANY highway, in ANY weather, heading towards ANY city at ANY time of day.
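For the technically curious, here's roughly what that kind of transformation looks like with off-the-shelf tools. To be clear, this is NOT my actual pipeline, just a minimal public-tooling sketch (OpenCV edges feeding a diffusers ControlNet img2img pipeline) of the per-frame restyling idea. Every model name, filename, prompt, and setting in it is an illustrative assumption:

```python
# Rough sketch only, NOT the actual Save Point pipeline: per-frame restyling
# of a base "control plate" clip using public tools (OpenCV + diffusers
# ControlNet). All model names, filenames, prompts, and settings here are
# illustrative assumptions.
import cv2
import numpy as np
import torch
from PIL import Image
from diffusers import ControlNetModel, StableDiffusionControlNetImg2ImgPipeline

controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

def restyle_frame(frame_bgr, prompt, strength=0.6):
    """Repaint a frame to match the prompt while its edge structure pins
    the composition (the actors, the table, the road) in place."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)  # the structure we want to keep
    control = Image.fromarray(np.stack([edges] * 3, axis=-1))
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    result = pipe(
        prompt=prompt,
        image=Image.fromarray(rgb),  # the plate seeds the generation
        control_image=control,       # edges constrain the new content
        strength=strength,           # how far the result may drift from the plate
        num_inference_steps=30,
    ).images[0]
    return cv2.cvtColor(np.array(result), cv2.COLOR_RGB2BGR)

# e.g. turn a daytime highway plate into the same shot at night in the rain
cap = cv2.VideoCapture("motorcycle_highway_plate.mp4")  # hypothetical clip
ok, frame = cap.read()
if ok:
    cv2.imwrite(
        "restyled_frame.png",
        restyle_frame(frame, "a motorcycle on a rain-soaked highway at night"),
    )
cap.release()
```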
Once one of the transformations you see in the video above takes place, the scene is re-animated using the technique shown in the "Hybrid" video. The new scene is animated exactly as the original scene was, so no more weird AI movements. It's not automated yet, so it will take me a minute to make the first full proof video of everything working at full scale, but it's going to be very soon.
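To make "animated as the original scene was" concrete: the idea is to extract the motion from the source clip and use it to drive the new imagery. My real setup is the one shown in the "Hybrid" video; the sketch below is just a toy optical-flow version using OpenCV, with hypothetical filenames, so you can see the shape of it:

```python
# Toy sketch, not the real thing: carry the ORIGINAL clip's motion over to the
# restyled frame with dense optical flow, so the new scene moves the way the
# old one did. Filenames are hypothetical; assumes both clips share one
# resolution. A production system would be far more robust than this.
import cv2
import numpy as np

src = cv2.VideoCapture("original_plate.mp4")  # the motion donor
ok, prev = src.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
h, w = prev_gray.shape

# Start from the restyled frame produced in the previous sketch.
driven = cv2.resize(cv2.imread("restyled_frame.png"), (w, h))

grid_x, grid_y = np.meshgrid(np.arange(w, dtype=np.float32),
                             np.arange(h, dtype=np.float32))
out = cv2.VideoWriter("driven.mp4", cv2.VideoWriter_fourcc(*"mp4v"), 24, (w, h))
out.write(driven)

while True:
    ok, frame = src.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense per-pixel motion between consecutive frames of the original clip.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Push the restyled frame along that motion (backward remap).
    driven = cv2.remap(driven,
                       grid_x - flow[..., 0], grid_y - flow[..., 1],
                       cv2.INTER_LINEAR, borderMode=cv2.BORDER_REFLECT)
    out.write(driven)
    prev_gray = gray

src.release()
out.release()
```

Naive frame-by-frame warping like this drifts and smears fast, which is exactly the kind of "weird AI movement" the Hybrid technique is there to avoid; the sketch is only meant to show where the motion comes from.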
There's always more work to do: faster, better, smoother, more automated, etc. But this is it. If you can fully understand this workflow I've assembled, you can make any movie you've ever imagined. It will still take a lot of work. It just won't take money anymore.
(The video below shows our in-house solution for driving refabricated scenes from real or CGI base clips. I'm going to add one more "brain" to this stage, and it should then retain coherence at photorealism under motion.)
I should also note that the previous animation and lipsync videos were using a temporary solution; once I've streamlined and automated the current build, all animations should be lifelike, and line delivery natural and expressive.