Blue Steel


This video is special. Not because of its quality or content, but because of how it was made.

It demonstrates the first (partially) working pre-alpha version of the program "Save Point Vortex Extrapolator".

I'm working on building something that requires huge amounts of work to be done for next to nothing, so for Save Point to work, I needed a few more puzzle pieces that simply weren't available anywhere.

2,000 lines of Python later, I have a badly malfunctioning version of this final piece, lol. It works, sort of, as you can see from the video.

I'll explain this rather complex piece of software I'm writing some time when I don't need sleep. For now, here's what's important. To my knowledge, this is the first video to be edited autonomously. I've built a full auto video editor, because I was spending hours a day editing video, and had other things to do.

This video was created in 120 seconds using bespoke software, along with 200 others like it in the last week. The program detects beats in music and cuts to the beat, humanizing the edit according to controls I set. It can search visually through footage and automatically find things I ask for, like a bass guitar or a close-up shot. It can judge aesthetics like color balance, rule of thirds, sharpness, composition, etc., and pick out the best X percent of clips according to how I set it. It can select music by genre and randomize, then auto-sync the video to whatever track it or I select.
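The author's actual pipeline isn't shown, but the "cut to the beat, then humanize" idea can be sketched roughly like this. Everything here (the function name, parameters, and logic) is my own illustrative invention; in a real pipeline the beat timestamps would come from a beat tracker such as librosa's:

```python
import random

def snap_cuts_to_beats(beat_times, clip_count, humanize_ms=0, seed=None):
    """Pick one cut point per clip from a list of detected beat times
    (in seconds), optionally nudging each cut by a small random offset
    so the edit feels less mechanical.

    beat_times: sorted beat timestamps, e.g. from a beat tracker.
    humanize_ms: max random offset applied to each cut, in milliseconds.
    """
    rng = random.Random(seed)
    if clip_count > len(beat_times):
        raise ValueError("not enough beats for that many cuts")
    # Spread the cuts evenly across the available beats.
    step = len(beat_times) / clip_count
    cuts = []
    for i in range(clip_count):
        t = beat_times[int(i * step)]
        t += rng.uniform(-humanize_ms, humanize_ms) / 1000.0
        cuts.append(max(0.0, t))  # never cut before the track starts
    return cuts

# 120 BPM = one beat every 0.5 s; eight cuts land on every other beat.
beats = [i * 0.5 for i in range(16)]
print(snap_cuts_to_beats(beats, 8))  # [0.0, 1.0, 2.0, ..., 7.0]
```

Raising `humanize_ms` is one plausible way to implement the "humanizing" control the post mentions: the cuts stay anchored to the beat grid but drift by a few frames either way.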

This is super useful in so many ways. For example, if I want to make a supercut of every time someone pulls a gun in The Sopranos, I can just type "person holding gun" into this program, then drop the complete series into a folder. I hit go, and the next day there's an hour-long video of that, set to the beat of the original soundtrack or whatever music I want to use.
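The selection step of that supercut workflow can be sketched as a simple filter over scored clips. The function name and score dictionaries below are hypothetical; in the real pipeline, the per-clip scores would come from a vision model matching each clip against the typed query:

```python
def build_supercut(clips, query, min_score=0.5):
    """Keep the clips whose detector score for the query clears a
    confidence threshold, in source (chronological) order.

    clips: list of dicts like {"start": s, "end": e, "scores": {label: p}}.
    Returns the matching clips and the total running time of the cut.
    """
    hits = [c for c in clips if c["scores"].get(query, 0.0) >= min_score]
    hits.sort(key=lambda c: c["start"])
    total = sum(c["end"] - c["start"] for c in hits)
    return hits, total

# Toy data: three clips, two of which a detector scored as matches.
clips = [
    {"start": 10.0, "end": 14.0, "scores": {"person holding gun": 0.91}},
    {"start": 50.0, "end": 52.0, "scores": {"person holding gun": 0.12}},
    {"start": 90.0, "end": 95.0, "scores": {"person holding gun": 0.77}},
]
hits, total = build_supercut(clips, "person holding gun")
print(len(hits), total)  # 2 9.0
```

The resulting clip list would then be handed to the beat-sync stage to produce the final edit.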

The sync in this first pre-alpha video is pretty bad. Lots of bugs in the program; it's early in development. When it's finished it will behave like a human editor, intelligently looking for opportunities to tell the story of a provided script visually. Keep in mind that this demo is not it "working", it's just "barely functional". Right now even basic stuff like setting the pacing of clips relative to the beat is malfunctioning.

Automated assembly is the final milestone of the 0.8-series Save Point iterations, and once it's fully functional at a basic (unintelligent) level, that will be SP v0.9. Once this program is truly intelligent, that will complete the Save Point tech branch at v1.0.

I'll upload more examples tomorrow, so people who are interested can see how it works in different circumstances, but it's hour 17 of today so I'll leave it for later. I'll likely have an improved version that handles clip length better sometime this week.
 
I promised to add more videos here to show how this works a little better. I've been slow to do that because I've been working on fixing a lot of issues with it and also increasing the resource pool for this particular sub-project. So far I'd say I've spent maybe 400 hours on the main pipeline programming that makes this possible (from v0.87 to v0.9), and maybe 30 hours on the actual "Blue Steel" sub-project.




I've still got a few more days' work before this is permanently finished, including adding letterboxing, reframing, fixing some timing issues, etc.

These videos are still lower quality and have a significant number of minor errors.

Still, even if I stopped right now with this one, here's the milestone.

I've engineered the main pipeline up to a point where I can create an entire youtube channel in a day. This channel can upload a brand new video of a brand new song every day 365 days a year without me having to do any additional work. There are currently about 240 of these videos that I ran off as tests while I was building this first vortex. Now I'm focusing on just raising quality across all videos produced.

This is a prototype project to help me develop an extremely important part of Save Point, one that will allow me to automate the creation of lesser scenes within the fiction, specifically when a user chooses to do an inane activity such as watching television, attending sporting events, playing video games, fishing, etc. It can also be used to create scenes such as driving segments, people playing pool in a bar, or basically any type of scene that is simple and obvious enough that it doesn't actually require any human talent to direct. If you've seen a car commercial, you understand that genius-level intellect is not really required to pick out a few shots of a certain car driving down a road at a certain time of day. This type of technology frees me up to spend more time, budget, and focus on important parts of the script, such as human interactions with consequences or significant plot reveals.

It's not the first time a video game has tried to include some kind of minor diversion to expand the open feeling of its world. In the past, however, these flourishes were typically very limited, because not many players engaged with them and they cost the companies producing them money that they would never really get back. The bottom line is that Save Point will be the first game in which you could decide to sit down and watch television instead of pursuing the plot, and conceivably flip through channels and watch TV shows inside the game forever, with an infinite parade of content across a limited number of channels and shows. It allows an unprecedented amount of freedom and detail that many past games have attempted with only very limited success.

This is what gave me the idea originally, back in 2001 I think. I'll be running a number of these minor sub-projects over the next few weeks as I develop the core system in the pipeline. The goal is to produce three long-running TV shows this month and then move on with both the system and the shows completed.

 
The concept is very cool, but these videos don't convey any progression.
You could randomly reassemble these clips in any order and I don't think I'd be able to tell a difference in the end product.

They're all just a dude with a guitar?
You probably want to try more demanding examples to help debug and test the software.

The bottom line is that Save Point will be the first game in which you could decide to sit down and watch television instead of pursuing the plot, and conceivably flip through channels and watch TV shows inside the game forever, with an infinite parade of content across a limited number of channels and shows.

 
I probably didn't explain it too well; I did mention it in the initial post, I think. This version simply concerns itself with being able to assemble videos at all. The same program also does quite a bit of still-image sorting, and even visual recognition and sorting of videos. However, right now there is only a little logic in terms of how the videos are sequenced. These vague setups, where the videos don't have to be put in any particular order, are the only thing it can do right now at v0.9. I've already been building the logic for the real thing into it for quite a while, and when that's finished, that will effectively be SP v1.0. Basically, it's a very significant challenge to get this running at a functional level in the first place.

This version can be used to make supercuts, simple video feeds, music videos, and a lot of other things that don't specifically require a chronological plot structure. There are a huge number of things where this would be useful. An example would be the Lords and Ladies soap opera featured in Max Payne that I showed above. It features interchangeable points of plot progression that make the audience feel as though they are watching a TV show when they walk by the TV in a hallway or watch it for just a few minutes. It would not be viable to watch for its story. In my opinion, those videos need to be directed by a human.

However, the program will become vastly more capable as I begin to apply more and more intelligence to the assembly process, which is essentially what I'll be doing between SP v0.9 and SP v1.0. So right now it can make infinite music videos, or simple TV channels like shop-at-home, that will appear valid to a viewer unless they watch for a longer period.

I've already spent some time setting up a resource pool for the next phase of testing, which is a soap opera called "The Time of our Days". It will feature minor directing and staging abilities, like switching back and forth between two people having a conversation, and creating episodes with a beginning, middle, and end.

This whole thing is a monumental task which involves a lot more than just AI, but it also involves a ton of AI. There are absolutely zero AIs on the market or in open source capable of directing any video, so this is a from-the-ground-up process involving many months of design and programming. Right now the plan is to try to have it functional by the end of summer, at v1.0.

What this ultra-simplistic version is doing is laying the groundwork for that version, and also giving me a way to create a bunch of cheap and easy channels that generate income to feed into the project so that it can go faster.

The cool thing to me is that it's infinite and it's easy to improve. Every time I sit and work for one day expanding one of these infinite worlds, it increases in quality and coherence rather than in size. The day will come when this robot blues concert is probably epic and barely ever repeats itself.

Right now I think this is about 3,500 lines of Python code, not counting models and libraries. By the time it does what I think you're talking about, I suspect it will be 20,000 lines.
 

Blue Steel is the first micro spin-off channel I made to help test and develop the early stages of SP v0.9. It's a minor project, but it seems to be doing okay these first few days: about 1,000 people have watched it and about 200 have subscribed. To put that in perspective, the Save Point development channel itself got about 50 subscribers in its first three years. I think it's close to 1,700 now, but it took a long time to take off.

I've used the pipeline to automate the whole process at this point, and I'm just adding a new song and performance every day at 7:00 PM forever. It'll keep improving for a while but then I'll just set it to cruise control and jump out of the car.

Worked about 14 hours yesterday on the main pipeline and got the most basic form of intelligent subsections working. So moving forward these videos can have an intro, a guitar solo, establishing shots, basic plot progression, an outro, stuff like that. This is the foundational stuff for making scripted television later. I'll work my way up the chain as the program gets smarter: music video, then soap opera, then reality TV, then Western, then crime drama, etc. The singularity happens once I've gotten it to self-teach and use agents to gather its own examples to learn from.
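One minimal way to picture those "intelligent subsections" is a template-driven sequencer: walk an ordered list of section labels and draw a clip for each from a pool of footage tagged with that label. This is a hypothetical sketch, not the program's real API:

```python
import random

def assemble_by_template(template, pools, seed=None):
    """Assemble a clip sequence by walking a section template
    ("intro", "solo", "outro", ...) and drawing one clip per section
    from a pool of clips tagged with that section label.
    """
    rng = random.Random(seed)
    timeline = []
    for section in template:
        candidates = pools.get(section)
        if not candidates:
            continue  # no footage tagged for this section; skip it
        timeline.append((section, rng.choice(candidates)))
    return timeline

# Toy pools of pre-tagged footage (filenames are made up).
pools = {
    "intro": ["wide_stage.mp4"],
    "solo": ["fretboard_closeup.mp4", "low_angle.mp4"],
    "outro": ["fade_crowd.mp4"],
}
sequence = assemble_by_template(["intro", "solo", "outro"], pools, seed=0)
print([section for section, _ in sequence])  # ['intro', 'solo', 'outro']
```

Making such a template longer and more conditional (e.g., "cut back to the singer after every solo") is presumably where the climb from music video toward scripted television happens.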

Also engineered infinite environmental sound. I can lay down as many Foley tracks as I want and determine how often they occur, at what volume, etc., with complete automation of fade-in, fade-out, crossfade, how often you hear a track, and how random it is. So basically wind, rain, traffic, thunder, house noises, and so on can be developed and saved as infinitely regenerating environments that are never the same twice but always convey the same feeling. This lets me create perfectly mixed environmental sound beds that never repeat. No AI for this section, just procedurally generated background Foley made from recorded sound effects. I'll post a demo of that later once I have a good one; I just got this feature running yesterday, so I haven't had time to experiment with it yet.
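The scheduling half of a system like that can be sketched as placing events on a timeline with randomized gaps, where each event carries its own fade length. This is my own toy version, with invented names; the real mixer would also handle volume and crossfades between overlapping tracks:

```python
import random

def schedule_foley(track_len_s, event_len_s, min_gap_s, max_gap_s,
                   fade_s=0.5, seed=None):
    """Lay one Foley track (thunder, traffic, ...) onto a timeline:
    events occur at random gaps within [min_gap_s, max_gap_s], each
    with a fade-in/fade-out duration attached. Returns a list of
    (start, end, fade) tuples in seconds. A fresh seed yields a
    different but similar-feeling bed every time.
    """
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        t += rng.uniform(min_gap_s, max_gap_s)  # silence before next event
        if t + event_len_s > track_len_s:
            break  # next event would run past the end of the bed
        events.append((round(t, 2), round(t + event_len_s, 2), fade_s))
        t += event_len_s
    return events

# Ten minutes of occasional thunder: 8 s per clap, 20-90 s apart.
bed = schedule_foley(600, 8, 20, 90, fade_s=1.0, seed=42)
```

Running several such schedules in parallel (wind, rain, traffic) and mixing them down would give a bed that never repeats exactly but always conveys the same feeling, matching the behavior described above.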

Anyway, this is the beginning of that explosion I've been talking about for years. I've published about 400 videos in the last four years. In 2025 I expect to publish over 3,000 videos. Don't worry, I'm not going to try to get you guys to watch them all, lol.
 
It's just a one-off, the easiest possible test I could think of for a new piece of software in its infancy. Save Point has gone through almost four years of development, but the vortex extrapolator is less than one month old. Compared to norms, progress is moving very fast. Blue Steel isn't going to be a major thing within Save Point; it's just a quick test project, like Cloth World. At nearly 400 completed videos produced, it probably seems like I'm really going in on this thing, but that's just a testament to the power of this new system. I actually spent more hours of labor working on Cloth World than I did working on Blue Steel.

At this point, if I wanted to, I could just change a few settings in the interface and create 60 Cloth World videos overnight, lol. That's gonna be significant when a player in Save Point wants to go and watch the Indianapolis 500 or something. There's no way I was gonna sit around animating cars racing in a circle for six months to make one choice available.
 