AI Video has arrived - It looks incredible

Anyway, the TL;DR is: they might already have an AGI in the lab.


While it's unfortunate that white-collar workers, or cognitive workers, if you prefer, might soon be replaced by AI, the ugly truth is that blue-collar workers have been replaced by automation for decades. The death threats and calls for a Butlerian Jihad seem just a bit much. Maybe they should learn to code. Oh wait.

I'm sure we'll all believe it when we see it. Until then…
My discipline was computer science and I can give a professional opinion there.
They do not have AGI. I'll create a different thread for UBI.

I kinda went off topic myself there for a post, but I want to redirect that back to Sora and keep us on topic since Sora in particular affects the film industry so profoundly.
 
And just like that, Celtic's complaint about not recognizing all the prompt details has been conquered.
:huh: Conquered? Has it? I'm not seeing a single example of a complex multi-subject text prompt being turned into a video clip, so I'll hold off on my congratulations for a little longer.
 
What happened to blockbuster?
Blockbuster was a bog-standard tertiary-sector business, about as far away from "the movie industry" as it was possible to get while still having something (anything) to do with movies. They provided a means for customers to watch movies of their own choice in the comfort of their own home, a commercial model that is alive and well and no doubt streaming to a device near you at this very moment. Just because some random CEO misread the writing on the wall regarding the long-term viability of renting VHS tapes to a fickle consumer market doesn't mean that every other "disruptive technology" is going to wipe out everything that's gone before.

The first operas were performed in the 1600s, the first opera house opened in 1637, and guess what? 400 years later, despite all the technological advances that our species has brought to the world, there are thousands of performances every year all around the world where singers and musicians perform to an audience of 2000 people and, believe it or not, they don't even use microphones. Backstage, opera houses are teeming with costumiers, electricians, welders, carpenters, painters, graphic designers, sound and lighting technicians, IT professionals, voice coaches, choreographers, make-up artists, truck drivers, and many more, all using "modern technology" in their work. There's little risk any of them are going to lose their job to an algorithm, though.

The other day, YT suggested this video to me:
At 8m22s the presenter talks about "AI fatigue" - that we humans (even the non-creative types) will become so bombarded with AI's supposed creativity that we'll assume everything Wowww! is computer generated and dismiss it. That - in my opinion - will push people back towards live events of all sorts, and/or towards movies that have been crafted in the traditional way rather than churned out of an industrial video generator.
 
@CelticRambler, you've got a point. Blockbuster has given way to streaming, but people are still buying or renting movies, just in a different manner. So the sales clerks at Blockbuster have been replaced by people who work at Netflix and Disney+.
Yeah, that's why the music industry is the better example.
Nobody is buying CDs anymore, all music is free for me, and basically only a small niche of people are buying vinyl.

To me this is like giving everyone a professional cinematographer, once it's mature.
Finally everyone has a chance to work with a professional and get images that look amazing.
 
Nobody is buying CDs anymore, all music is free for me
If "all music" is free for you, does that mean you're excluding all non-free music from your listening experience?

And does "Nobody" mean everyone - because I for one still buy CDs, almost always from a box on the stage where the artists have just been performing, and usually I'm just one somebody in a queue of nobodies happily handing over a 10€ note for the pleasure. One of my favourite groups are recording their newest (third) CD in a few weeks, and they've invited those of us on the mailing list to come along and add to the ambiance ...

Then again, if you refer to the music industry, well that's not at all the same thing as music, just music. Again, despite several hundred years of technological advances, there is still no machine, no computer, no technological whizzbangery that can replicate the experience of listening to real music played by a real musician in real life (no mics, no amps, no autotune ... ) So yeah, sure, the purveyors of cheap disposable muzak might go the same way as itinerant knife-grinders and the hawkers of smelling salts, but again that's just the continuing evolution of trade and commerce that's been going on since the first neanderthal sold a pointy stick to his neighbour.
 
If "all music" is free for you, does that mean you're excluding all non-free music from your listening experience?

And does "Nobody" mean everyone - because I for one still buy CDs, almost always from a box on the stage where the artists have just been performing, and usually I'm just one somebody in a queue of nobodies happily handing over a 10€ note for the pleasure. One of my favourite groups are recording their newest (third) CD in a few weeks, and they've invited those of us on the mailing list to come along and add to the ambiance ...

Then again, if you refer to the music industry, well that's not at all the same thing as music, just music. Again, despite several hundred years of technological advances, there is still no machine, no computer, no technological whizzbangery that can replicate the experience of listening to real music played by a real musician in real life (no mics, no amps, no autotune ... ) So yeah, sure, the purveyors of cheap disposable muzak might go the same way as itinerant knife-grinders and the hawkers of smelling salts, but again that's just the continuing evolution of trade and commerce that's been going on since the first neanderthal sold a pointy stick to his neighbour.
Yeah, I think I am missing out on the Insane Clown Posse, but that's not a big loss for me.
Personally I do not buy ANY movies anymore; I haven't bought a movie or a CD or an album or anything in decades.

It's good that some people do, but the industry has most definitely been massively disrupted. IDK how anyone can deny that.

:huh: Conquered? Has it? I'm not seeing a single example of a complex multi-subject text prompt being turned into a video clip, so I'll hold off on my congratulations for a little longer.

You're funny, man. It was announced in beta by Stability AI, is unavailable to the public, and hasn't been implemented yet by OpenAI; these things take time to implement after the solutions are made available. 😆

It's cool we have different opinions, makes things interesting, but there's no point in arguing about it.
Time will prove one of us right and one of us wrong. Let's wait and see.

Much to my chagrin I've been wrong before in my life, shit does happen lol.
I'll be prepared to eat crow if it comes to it.
 
I'll put my metaphorical money where my mouth is and throw out an easily verifiable prediction with a timeline right now.
By the end of 2026, I predict almost every new commercial on TV will have been made with AI video technology instead of filmed.

The entire subset of commercial filming, especially for local businesses, will have been eliminated.
Now we sit back and wait.

 
By the end of 2026, I predict almost every new commercial on TV will have been made with AI video technology instead of filmed.
I can believe this more easily than having "all" movies made with AI video technology any time soon because no one gives a damn about artistry in commercials - only making money matters. And that's why I think that, even if big budget movies go that route, there will continue to be a universe of indie movies that use actual creatives to tell stories because that's where artistry DOES matter.

We shall see.
 
I can believe this more easily than having "all" movies made with AI video technology any time soon because no one gives a damn about artistry in commercials - only making money matters. And that's why I think that, even if big budget movies go that route, there will continue to be a universe of indie movies that use actual creatives to tell stories because that's where artistry DOES matter.

We shall see.
That's true, but it also depends on the kind of movie you want to make.
We see it all the time these days: something would be cooler as a practical effect, but it's done with CGI instead because it's cheaper.

Alita: Battle Angel opted to be made almost entirely in CGI because of all the effects they wanted for their story.

Edit to add:
Twister, 2012, or a TV show like The Flash are all great candidates.
It was insanely expensive to film that X-Men: Apocalypse scene with Quicksilver running fast among slow-mo people.

IMO the AI faces are the easy part of acting. The difficulty is in the voice; that is the most nuanced part.
I think it will be trivial to adapt the mouth to a pre-recorded voice line once this gets up and running, so there's the potential for a lot of photorealistic CGI movies using voice actors in the near future. It will be a boon for voice actors. Even when they 'conquer' AI voice, it will most likely be done by copying and replicating an individual, and good voice actors will be able to sell their likeness with a fat contract.
 
🤔 What are these "commercials" of which you speak?

I only see them when I go to my parents house 😄
Here is one that I saw today... so many commercials are like that, with B-roll + a voiceover.


They also have a lot of Hallmark movies on because of my grandma, and when I watch those I can't help but picture them all made with AI now.
The acting is so mediocre and the lighting is so bad that they will actually be BETTER movies when they're made with AI.

It's really not hard to imagine AI acting at least as good as these people on Hallmark.
 
The problem with VR / Virtual Reality is that the resolution is terrible.
By designing a custom piece of hardware, using AI tech, that takes a low res image and blows it up to super high res, we can ELIMINATE PIXELS [EDIT: Meant eliminate pixelization - not pixels] in virtual reality and only add a tiny bit of weight to the headset. Add one chip to each eyeball display, boom, you're done.

Wow VR is finally going to take off.
 
:huh: Is that your own assertion, or are you quoting from someone somewhere?

Either way, there's a giant incoherence in the proposition: visual "resolution" is measured in pixels, so if you eliminate them, you inevitably reduce the image resolution. Given that our eyes function at a resolution of 576 Mpx (ISO range 1-80000), is this AI-designed custom hardware sending signals directly into our visual cortex, bypassing our eyes entirely?

But the proposal is to add a chip to each eyeball display, so no: it looks like the concept comes back to stimulating our rods and cones, requiring a high-definition, megapixel display. Something doesn't add up. 🤓
 
:huh: Is that your own assertion, or are you quoting from someone somewhere?

Either way, there's a giant incoherence in the proposition: visual "resolution" is measured in pixels, so if you eliminate them, you inevitably reduce the image resolution. Given that our eyes function at a resolution of 576 Mpx (ISO range 1-80000), is this AI-designed custom hardware sending signals directly into our visual cortex, bypassing our eyes entirely?

But the proposal is to add a chip to each eyeball display, so no: it looks like the concept comes back to stimulating our rods and cones, requiring a high-definition, megapixel display. Something doesn't add up. 🤓
Yeah, I phrased that incorrectly, and it ended up coming out wrong and making me sound dumb.

To clarify: my own intended assertion was that it can get the resolution high enough to eliminate PIXELIZATION, high enough that the pixels are indistinguishable to the human eye, like an Apple Retina display. You're right, there will still be pixels in the literal sense; I said that wrong.

Right now VR displays are too pixelated to be pleasant; any object far away looks like a glob of pixels, like Minecraft from a mile away. We can eliminate that eyesore with hardware-based image upscaling and actually make VR look good for once.

It's a bit tricky, one of the trickier things I've proposed from a technical standpoint, because it comes down to how fast the AI hardware chips actually function in that scenario. It's a real possibility though, IMO.
 
Ah, OK. That makes a lot more sense. Making you think you're seeing pixels where there aren't any is a reasonable use of an algorithm - but it's still only mimicking what our brains do with UHD images coming in through our own eyes.

I haven't really paid an awful lot of attention to VR developments, but from the little I've picked up, it seems the technology struggles with low levels of ambient light too. I'd also wonder if our ability to rapidly re-focus from near to far and back to near again will be a limiting factor in recreating an artificial 2D-pretending-to-be-3D representation of the wearer's immediate surroundings.

I can imagine a situation where that whole image-capture-to-visual-display step is completely bypassed by a process that feeds the supplementary info directly into our optic nerve. That's the kind of leap that'll come at the IT industry from the biomedical field and could render (:blush: ) a lot of VR headset technology and research obsolete very quickly.
 
from the little I've picked up, it seems that the technology struggles with low levels of ambient light too.
The trouble with low levels of light is the tracking: it uses cameras to watch your hands and controllers, but check this out!


They've been working on this for years already, and say it'll be ready for the Quest 4 in 2027.
With this device there's no more trouble with low-light tracking, because it reads the signal directly from your wrist.

As far as VR going mainstream, it needs 2 big things: comfort and resolution.
Once it becomes comfortable (lightweight) while being high resolution, it will be embraced.

Right now mainstream VR is roughly 1832 × 1920 resolution per eye, and then you basically put a huge magnifying glass on top of it so that the screen dominates your field of vision. Once you have that headset and magnifying glass on, you can see the pixels...

Watching a YouTube video of a VR game on your 4K monitor is NOT what the game will look like if you put a Quest 3 on.
Essentially we need a screen the same size, but with a resolution of about 6,000 × 6,000; then we won't see the pixelation anymore. However, current hardware is nowhere close to being able to handle that much computation at that frequency.

In come the AI chips... if we can transform a 2K image into a 6K image in 0.1 ms, suddenly you are offloading all of the slow, time-consuming computations onto the new-age AI chip.
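The numbers above can be sanity-checked with some back-of-envelope arithmetic. This is a quick sketch, not anything from the thread, and it assumes two round figures of my own: roughly 100° of horizontal field of view per eye, and roughly 60 pixels per degree (PPD) as the point where pixels stop being distinguishable to someone with 20/20 vision.

```python
# Angular pixel density (pixels per degree, PPD) for a current-gen panel
# vs. the hypothetical 6,000-pixel-wide panel discussed above.
# Assumed numbers (mine): ~100 deg horizontal FOV, ~60 PPD "retina" threshold.

def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    """Average horizontal pixels per degree of field of view."""
    return horizontal_pixels / fov_degrees

RETINAL_PPD = 60.0  # rough threshold where individual pixels vanish

for name, h_px in [("1832 px panel", 1832), ("6000 px panel", 6000)]:
    ppd = pixels_per_degree(h_px, 100.0)
    verdict = "at/above" if ppd >= RETINAL_PPD else "below"
    print(f"{name}: {ppd:.0f} PPD ({verdict} the ~60 PPD threshold)")

# Frame-time budget: at 90 Hz each frame must finish in ~11.1 ms,
# so a 0.1 ms upscaling pass would eat under 1% of the budget.
frame_budget_ms = 1000.0 / 90.0
print(f"Frame budget at 90 Hz: {frame_budget_ms:.1f} ms")
```

Under those assumptions, ~1832 px across works out to only ~18 PPD (hence the visible "glob of pixels"), while ~6000 px lands right at the ~60 PPD mark, and a 0.1 ms upscale would fit comfortably inside a 90 Hz frame budget.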
 
In come the AI chips... if we can transform a 2K image into a 6K image, suddenly you are offloading all of the slow computations onto a fast AI chip.

Holy crap, everything is happening so fast. I just talked about this yesterday!
They're working with NVIDIA now to offer seamless upscaling of games into super high resolution 😳


It begins with PC then moves to VR

 
Re. the AI, I join my fellow Luddites, mlessman and Celtic Rambler. It doesn't bother me. I don't need it. I don't want it.

Although, for people who actually construct films, and probably especially for animators, I can kind of get why it could be troublesome. But for writers, I just can't believe it can ever be any competition.

What the AI will almost certainly discover (fine, certainly discover; fine: 'dude, already there') are the algorithmic steps designed to tell a story, or, more importantly, to create an emotional experience in a viewer/reader: a generic core formula extracted from billions of examples, with blanks to be filled in with whatever: a sentient dog, a paranormal chosen-one teen, whatever.

But writing that is too much like other writing is by definition bad, even if it is technically correct. And this is all the robot can do.
 