A Proper Gander At Propaganda


"Propaganda in the United States is spread by both government and media entities. Propaganda is information, ideas, or rumors deliberately spread widely to influence opinions. It's used in advertising, radio, newspaper, posters, books, television, and other media."  -  Propaganda in the United States - Wikipedia

"A man without a government is like a fish without a bicycle."  -  Alvaro Koplovich

"The rise of modern print culture: The printing press and the relatively wide availability of printed materials further undermine the importance of the 'local community' in ... Feudal societies based on face-to-face loyalties and oral oaths begin to give way to nation-states and to nationalism based on a shared printed language."  -  Marshall McLuhan: Theoretical elaborations

"The printing press gave rise to nationalism and nation states while the Internet is helping to create a world community."  Understanding New Media: Extending Marshall McLuhan

About A Proper Gander At Propaganda:

This is not another conspiracy website. This website exists to serve as a free public resource for reverse imagineering world-wide culture. I, AA Morris, am merely the curator of this website and blog. You need no permission from me to reproduce anything you find here. Simply source and credit those who obviously deserve credit, as we are all supposed to do. You need not credit AA Morris. You can use anything I write and any artwork or video I produce as you like, without needing to credit or source me. This website should be considered a virtual museum of sorts, a gallery of art and other types of relics that represent our shared world-wide cultural heritage.

Thank you for your time,

AA "The Proper Gander" Morris

Article index

International Space Station: Gravity Fails


*Article updated February 8, 2018.


PRESENTS: An Internationally Sponsored Hoax We Pay For


There are many ways to fake "Zero-Gravity" that require neither flights of unreasonable fantasy nor parabolic arcing flights. Down-to-Earth Hollywood stage magic illusion, and both old school and new school techniques and technology, are the real science used to create images of human beings doing the impossible. International Space fakery is no different from any other Hollywood-style film or video special effects production. Our internationally derived taxes do not go to protect us from any advertised phony phantom menace; our tax money goes towards things like building a worldwide chain of Coca-Cola bottling plants, international hotel chains, the infrastructure for pop celebrity concert tours, and research and development of video game and Hollywood special effects technology.

This article looks into the secret "key" to a really great "Zero Gravity" production.

The above image is a mistake. This kind of continuity error is a clear indication that we are watching nothing but the result of a Hollywood-style video production. The International Space Station is not real. As you can see, the tortilla is indeed subject to gravitation. "Outer Space" is only an imaginary place.

Today's images of "Outer Space" are illusions and the result of old school stage magic tricks and new school CGI illusion.

image source: Cooking in space: whole red rice and turmeric chicken •  European Space Agency, ESA  •   https://www.youtube.com/watch?v=4exaXdPKS3Y

(Please excuse the inevitable typo. Auto spell correct is an evil, digital gremlin. AA Morris)

Outer Space is not a real place: It can only exist as art of some kind.

The only way anyone can interact with the idea we call "Outer Space" is by virtual means.

There are numerous problems with "Outer Space Science" that make the feats of international space agencies impossible. All the evidence we can examine clearly indicates there is no reason to believe "Outer Space" is anything but fantasy. Space Programs are clearly part of a long-standing international hoax that relies on the mass public having faith in the claims of superhuman feats of government. People must also buy into worldwide News media hype that claims the News is independent from Governmental influence and control, when nothing could be further from the truth, as anyone who simply uses the internet as the amazing library research tool it is knows. The World is simply not run as advertised. History is not what we think it is, and the fact of the matter is that myth, celebrity and idolatry matter more than the actual and Natural facts of the matter. History is made of many lies, and so is what passes for "science". A lot of what people think of as "science" is actually propaganda with no actual physical demonstrable support.

The Secret To A Really Convincing "Zero Gravity" Shoot: An Invisible Green Screen Assistant

A real life "Invisible Man". There is no reason to leave the safety of a governmentally contracted and secure studio to create the illusion of "Outer Space" and "Zero Gravity". CGI water and other such effects are nothing new. What is new is the public availability of real time CGI effects. I do not think it a stretch to consider the very real possibility that believable CGI technology is older than we might think; but here's a secret of sorts: we do not need computers to craft believable darkroom effects. A skilled artist could always create believable darkroom illusions. Photo fakery and "Photoshopping" are as old as film itself. What one can do with still photography, one can do with moving imagery. Believable visual illusions are as old as photography itself.

Visual Magic

There are many techniques that the skilled visual illusionist can rely on to make the impossible seem real. Such illusions can only exist in a virtual "space" of some kind. The future looks to be one of virtual worlds, some sold to the public as real, visited by the medium of virtual reality goggles and earphones. One day you will be able to log onto NASA's Mars Colony Mission website and be virtually transported, with 360 degree visual fidelity, to what NASA will claim is another world.

green key secret

An Invisible Assistant Will Enhance All "Zero Gravity" Productions

The secret green screen "key", an invisible ally. Believable "Outer Space" imagery can be created from the safety of the green screen studio. In the hands of a skilled team of artists, fantastic ideas can be realized. Only a handful of creative staff would be needed to complete the production. Of course, non-disclosure agreements and National Security Clearances cover for many special effects sins. National Security Law, Federal and Military prisons, and death penalty punishment for traitors to the cause are real; "Outer Space" is not. Government funded projects of this caliber would pay well, and getting a governmental propaganda based contract would seem to be like finding the proverbial leprechaun's pot of gold at the end of the allegorical rainbow.

The taxpayer foots the bill for the creation of powerful visual propaganda that Governments of the World sell as real. All part of a very real and very profitable Atomic era Cold War hoax that continues to this day.

image source: International Space Fakery and http://chromagreen.ca





image source: YouTube Stunt Training Wire work Vickyhahaful and NASA film special effects work.

"LimoStudio Photo Video Chromakey Green Suit Green Chroma Key Body Suit for Photo Video Effect, AGG779"

"Specially designed for Photo or Video Special Effects Works Perfectly most of Green Muslin Backdrop"

source: https://www.amazon.com/LimoStudio-Chromakey-Chroma-Effect-AGG779/dp/B005FMOX9M




The "Green Man"

image source: http://knowyourmeme.com/memes/green-man

"We offer an effective and affordable chroma key suit option"

"We offer inexpensive and reliable full cover body suits, gloves and hoods for green screen filming and video production purposes."

"All of our chromakey suits and accessories are made of a stretch fabric that allows for easy mobility while being close fitting to avoid wrinkles and shadows. Based on your production requirements, you can choose between the two types of green chroma key suits or blue chromakey suits."

source: http://www.chromakeysuit.com/


image sources: https://twitter.com/austinkleon/status/752298066569420800  •  https://www.ronbinion.com/puppeteertvfilm-1/2017/10/3/puppeteer-overview

The Magic Of The Chroma Key

"More commonly known as green screen or blue screen (though that one also has a different meaning), the process by which a subject filmed on a camera can be seamlessly inserted into a scene generated by other means. It relies on filming the subject in front of a solid-color background — any color will do, so long as it's not used in the foreground — and adjusting the editing system to replace that color with the background signal. The main methods of controlling the background for live-action shots are, in increasing order of technological sophistication: finding one that already exists, and filming on location; building a set; filming the background separately, and projecting it onto a screen behind the actors while filming, typically via rear projection; double-exposing the film, which results in a slightly transparent foreground but is cheap; and the old analog Matte Shot, done with precise blocking of the camera frame."

"Chroma Key. The background inserted via Chroma Key can be any visual image. CGI is the most common today, but it can be other live action footage, models, stop motion or cel animation just as easily. The color used is now entirely arbitrary. Blue was a popular choice in the early days of color motion pictures, because it is complementary to the reds found in human skin. Green became popular because digital editing systems can isolate green with less light in the background, and because lime green is less common than bright blue in costuming. Magenta is sometimes used, as is black, but the latter is problematic, as it's almost impossible to shoot a person without having some black visible on their person, in eyes or shadows."

"If any part of an actor or prop is colored the same as the background, that part will disappear. Thus, sometimes the background color is chosen because of the colors to be used in the foreground action. The original run of Doctor Who, for instance, used green or yellow backgrounds even when blue was the most common color at The BBC, because a large number of its effects shots involved the TARDIS, a timeship that takes the form of a blue police phone box. The problem with using yellow was that foreground objects and actors always had a prominent yellow fringe around them. Normally, wardrobe and prop designers simply avoid using greens in the capture range, but this is not always possible; you'll occasionally see bloopers where weather forecasters have part of the meteorology map show up on their ties, for example. The invisibility effect can be used intentionally to allow a performer, or part of his body, to interact with props while remaining unseen. A garment that can be used for this purpose is a one-piece jumpsuit in the background color, with a full-face mask and a mesh eye piece, called a "gimp suit" or, in the case of a blue background, a "blueberry" in the trade. Performers in recent Jim Henson Productions shows have used these suits to perform with puppets without having to raise them above their heads. The suit looks like a Ninja outfit, and that is not a coincidence, as it serves the same purpose as the black outfits traditionally worn by Japanese stagehands. See notes at Ninja."

"Almost all productions use Chroma Key at some point, but there are some standout examples. Also notable for causing occasional unintentional hilarity - when background and foreground are poorly matched, or the SFX budget is low, the effect is anything but seamless. It can be fairly tricky to create a viable Chroma-Key effect, especially with amateur equipment — often, it requires fiddling with hue and saturation, and even then, there is often a faint, tell-tale 'border' around the subject where the green-screen footage and the 'real' actor don't match up."

source: http://tvtropes.org/pmwiki/pmwiki.php/Main/ChromaKey
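The keying step those quotes describe is simple enough to sketch in a few lines of code. Below is a minimal, hypothetical illustration in Python (the pixel values, key colour and tolerance are invented for the example, not taken from any real compositing system): any foreground pixel close enough to the key colour is swapped for the background pixel, which is also why a prop coloured like the backdrop "disappears".

```python
# Minimal sketch of the chroma-key step described above.
# A pixel matching the key colour (within a tolerance) is replaced by
# the background pixel; everything else keeps the foreground pixel.
# The tiny 4-pixel "images" below are hypothetical, for illustration only.

KEY = (0, 255, 0)  # pure green backdrop

def is_key(pixel, key=KEY, tol=60):
    """True if the pixel is close enough to the key colour to be replaced."""
    return all(abs(c - k) <= tol for c, k in zip(pixel, key))

def chroma_key(foreground, background, key=KEY, tol=60):
    """Composite: keep foreground pixels, except where they match the key."""
    return [bg if is_key(fg, key, tol) else fg
            for fg, bg in zip(foreground, background)]

# A 4-pixel "shot": an actor's red sleeve against the green backdrop...
fg = [(200, 30, 30), (10, 250, 10), (10, 250, 10), (200, 30, 30)]
# ...and a starfield background to drop in behind it.
bg = [(0, 0, 40), (0, 0, 40), (255, 255, 255), (0, 0, 40)]

print(chroma_key(fg, bg))
# Green pixels vanish and the background shows through:
# [(200, 30, 30), (0, 0, 40), (255, 255, 255), (200, 30, 30)]
```

Real keyers work on full frames and produce a soft-edged matte rather than this hard yes/no decision; a badly tuned tolerance is where the tell-tale 'border' around the subject comes from.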


image source: https://stephenfollows.com/film-crew-work-on-set-or-in-post-production/

Case Study: Early 1970's Low Budget BBC Chroma Key Magic

"Letts agreed to produce Who if he could still occasionally direct, and so “Terror of the Autons” became his return to the control room. It is colorful, garish, and looks downright weird. Letts fell completely in love with the blue-screen chromakey tech, and used it for every possible special effects shot, including creating places like museums, and, in part two, the kitchen of a small house. It’s not that it all looks fake necessarily – although it certainly does – it’s that it looks amazingly unreal. You aren’t supposed to look at the environments of a television narrative, even environments as seen in modern TV that are created entirely by CGI over green-screen sets, and ask what in the world you’re seeing. You go to the movies and intuitively, immediately, instantaneously understand that Benedict Cumberbatch, dressed as Dr. Strange, was in a studio and computers filled in the weird world around him. You see the actress Barbara Leake in front of what seems to be a kitchen and it all looks so amazingly wrong that it becomes weirdly unsettling."

"And it keeps happening! The same scene includes a pair of shots where the troll doll, an Auton weapon used by the Master, is played by an actor in a suit CSO’ed onto the “sitting room” set, and the special effects people can’t get the doll’s size right in relation to the furniture. So the milliseconds that it should take to process these weird images get a little prolonged. It doesn’t seem “bad,” to me; it seems “wrong.” "

source: https://firebreathingdimetrodon.wordpress.com/2017/07/06/doctor-who-terror-of-the-autons-part-two/



"TV production, composition and storage is now entirely digital, so computers are a necessary and inherent part of the production process. Not so in the 1960s and 70s, during the classic series' lifetime. Back then the final product was analogue: two-inch Quad videotape masters made from edited videotaped studio footage and telecined 16mm film from location work and model shots. By the end of Doctor Who's initial run, computers were already being used in TV graphic design, model photography and video effects. Think, respectively, of Oliver Elmes' title sequence for Sylvester McCoy's Doctor, of the motion-control opening Time Lord space station model shot of the mammoth Trial of a Time Lord season, and of the various pink skies and blue rocks applied to extraterrestrial environments during the Colin Baker and McCoy eras. Yet all these computer applications ultimately still resulted in analogue footage. A sequence shot on analogue videotape would be digitised, tweaked in a gadget like Quantel's Paintbox rig, and then converted back into the analogue domain to be edited into the rest of the (also analogue) material. During the early 1980s (1983 in the case of Doctor Who) the BBC moved from two-inch Quad tapes to the more compact, more sophisticated one-inch C Format tape, but it was still analogue."

source: http://www.theregister.co.uk/Print/2013/11/08/doctor_who_telly_special_effects_tech/

"CSO behaving badly"
Source: BBC

"PAL's colour signal transmitted red and blue colour levels - 'chrominance' - with the green level derived by taking the combined red and blue signal strength at a given point away from the luminance level at that same point. That saved bandwidth. So did transmitting colour data for effectively half the number of lines, though that was the result of PAL's alternation of the colour signal's phase to help correct phase errors by cancelling them out. Fortunately, the human eye's colour resolution is less effective than its ability to detect changes of brightness, so the lower colour resolution was not a problem. One upshot of all this became known by BBC engineers as Colour Separation Overlay, or CSO. Engineers working for the broadcasters on the independent network called it Chromakey. Whatever the moniker, the essence of the technique involves substituting one picture signal for part of another, the substitution keyed to a specific colour, hence the ITV name. The signal from the foreground camera was run through a 'key generator' which essentially created a video mask derived from the key colour. After the key signal had been adjusted for hardness or softness, it was used to control the video mixer being used to combine the output from the foreground camera and the signal from the background camera."

"Unlike CGI, the CSO mixing was done live, before the shot being recorded, so any errors remained for all to see. Studio-recorded TV in the 1970s and 80s generally lacked the time to remount shots with iffy CSO, so directors just had to live with glitches, of which there are many to be spotted in Doctor Who episodes from the period. We’ve all seen them: dark or coloured lines at the borders between keyed picture components; areas of the foreground picture appearing to disappear, the result of studio lighting casting light of the same hue as the key colour and incorrectly triggering superimposition, or the key colour backdrop itself reflecting off a shiny object; and of course hair or glass not looking right because the intensity of the key colour had dropped out of range while passing through it. The key colour - usually yellow in early 1970s Doctor Who episodes, and blue later on - had to be carefully selected to avoid replicating natural human skin tones. With the colour set, parts of the set into which separately shot images would be later dropped - radar screens, windows, stuff like that - would be painted in the key colour. If the lighting was a bit off, the apparent colour intensity would fall below that the mixer was expecting to key an image to. Costumes had to be designed to ensure they didn’t include the key colour. Naturally, that didn’t always happen."

"And sometimes Vision Mixers, the guys charged with making the effect happen, simply forgot to key in the image to be superimposed. Watch out for unexpected large areas of yellow or blue behind the Doctor or others. Examples are legion, particularly when yellow was used as the key colour - blue might just occasionally be mistaken for sky. With time for patient, careful set-up, CSO could be quite effective, and there’s plenty of good CSO in classic Doctor Who too. The best examples are barely noticeable. But time was not a plentiful commodity during the recording of a Doctor Who episode and the results could be very poor indeed."

source: http://www.theregister.co.uk/Print/2013/11/08/doctor_who_telly_special_effects_tech/
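The luminance arithmetic The Register describes, recovering green from the luminance level minus the red and blue contributions, can be checked with a few lines of Python. This is a simplified sketch using the standard PAL/Rec. 601 luma weights; real PAL actually transmits the colour difference signals (R-Y and B-Y) rather than raw red and blue levels, but the reason green never needs to be transmitted directly is the same arithmetic.

```python
# Sketch of how a receiver can recover the green level from luminance
# plus the red and blue levels, using the standard luma weights
# (Y = 0.299 R + 0.587 G + 0.114 B). Channel values here run 0.0-1.0.

WR, WG, WB = 0.299, 0.587, 0.114  # standard PAL/Rec. 601 luma weights

def luma(r, g, b):
    """Luminance (brightness) of an RGB pixel."""
    return WR * r + WG * g + WB * b

def recover_green(y, r, b):
    """Derive the green level from luminance and the red/blue levels."""
    return (y - WR * r - WB * b) / WG

# Round trip on an arbitrary pixel: encode to luminance, recover green.
r, g, b = 0.25, 0.8, 0.4
y = luma(r, g, b)
print(round(recover_green(y, r, b), 6))  # → 0.8
```

The round trip holds for any pixel, which is how the broadcaster saved bandwidth by never sending green at all.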

"Enter Computers, stage left"

"CSO had some inherent limitations too. Before the advent of motion-controlled camera rigs, it was impossible to move the cameras photographing the shot’s two components in perfect harmony. Nudge one and the background, say, would move while the foreground didn’t. Many a nicely keyed shot was spoiled by this kind of disjointed movement. It meant that CSO really couldn’t be used for anything other than static shots.  It didn’t stop directors from trying, of course. A common technique was to film stock footage of smoke or rain in order to superimpose it on a previously recorded tracking shot. But while the background shot reflected the camera’s pan, the smoke or rain shot didn’t, rather giving the game away. Of course, you could make exactly that kind of mistake with the laborious optical printing process used by the movie business at the time, and many films did, but it seemed worse somehow on the small screen. Eventually, Evershed Power Optics engineer Reg King came up with Scene Sync, a motion control system which could co-ordinate two cameras’ movements to allow CSO background and foreground shots to move in harmony. A motion detection rig fitted to the first camera picked up pan and tilt movements and relayed them by cable to a slave unit which controlled the second camera. With careful calibration, the system could scale down movements to match the different scale of the subjects being shot. A camera recording a background model at 1:10 scale had to be panned a tenth as far as the master camera moved in order for the movements to match up. Scene Sync was first tried on Doctor Who during the recording of Season 18’s Meglos - which was also the only time it was used on the show. The production team allowed the story to be a guinea pig for the new technique, in exchange for which they got to use Scene-Sync for free. 
While the Meglos experiment provided valuable experimental data that would be used to refine the technique for other shows, including The Borgias and Jane at War, the Doctor Who production team found the process’ manual calibration too time consuming to rely upon it."

"In any case, just as computers were being used to control camera movements - a trick that goes back to the mid-1970s when it was pioneered on the likes of Star Wars - they were also moving into the video effects arena. As we’ve seen, Newbury, Berkshire-based Quantel launched Paintbox in 1981, and it soon became easier and cheaper to do picture composition and adjustment in the digital domain. Paintbox comprised custom hardware that could grab two video fields, digitise them into a frame and store it. A keyboard, a tablet and a stylus was all a trained operator needed to combine video sequences, adjust the colours and paint in new ones - hence the name. In 1982, Quantel introduced Mirage, a box which allowed frames to be replicated, scaled, distorted and bounced around the screen. TV opening titles would never be the same again. As Paintbox quickly defined the look of early 1980s pop videos, so Mirage defined how a decade of television was presented. Doctor Who benefited in particular from Paintbox, not merely because it soon proved a more flexible, more efficient alternative to CSO, but it allowed those archetypal quarries-as-alien-landscape shots to be made to look even more extraterrestrial by colouring the rocks blue, painting the sky purple or dropping in shots of smoking volcanoes in the background. It also allowed director Lovett Bickford to pull Tom Baker to bits in The Leisure Hive."

source: http://www.theregister.co.uk/Print/2013/11/08/doctor_who_telly_special_effects_tech/
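The Scene Sync calibration rule quoted above, that a camera shooting a 1:10 scale model must move a tenth as far as the master camera, is just a linear scaling of the master's motion. A hypothetical sketch (the function name and numbers are illustrative, not from the actual Evershed Power Optics rig):

```python
# Sketch of the Scene Sync idea described above: the slave camera,
# shooting a scale model background, mirrors the master camera's
# movements scaled down by the model's scale factor.
# Names and values here are illustrative, not from any real rig.

def slave_move(master_move, model_scale=0.1):
    """Scale a master-camera movement (x, y, z) for a 1:10 model shot."""
    return tuple(m * model_scale for m in master_move)

# The master camera tracks 30 units right and 5 up on the studio floor;
# the slave must track 3 right and 0.5 up across the model.
print(slave_move((30.0, 0.0, 5.0)))  # → (3.0, 0.0, 0.5)
```

Without this kind of coupling, nudging one camera moved the background while the foreground stayed put, producing exactly the disjointed movement The Register describes.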



image source: 1974 BBC "Doctor Who"  https://en.wikipedia.org/wiki/Robot_(Doctor_Who)

Before CGI There Was "CSO": A Look At Early 1970's Television Technology

"CSO is the common acronym for a special effects procedure championed by producer Barry Letts during the Jon Pertwee era, but continued to be used by the programme for a considerable time thereafter. It was created using an optical printer, although later methods, especially from the early '80s onwards, were done digitally, with a computer (the first one to be used for that process being the Quantel Paintbox, whose use in the programme began in the 1980 story The Leisure Hive). It allowed two different live shots to be mixed together to create the illusion of the two elements being in the same shot. The acronym itself stands for colour separation overlay. Outside the BBC, the process is generally known as chroma key."

"As the name implies, it involves separating the colour of the background from one shot then overlaying that now-"background-less" shot onto another one. Usually the colour of the background will be green, blue or yellow, because these colours are not natural human skin tones. The mixer then "keys out" that colour, replacing it with the image of another shot. While this "greenscreen" process is a common building block to other special effects processes, like CGI, what distinguishes CSO is the fact that the two shots are mixed together and then recorded live. Any errors of alignment, scale or colour separation are committed to the shot."

"The procedure fell out of favour in dramatic presentations largely due to its inability to depict appropriate perspective between the two shots. The limitations of scale inherent in the process can clearly be seen in numerous serials, Invasion of the Dinosaurs being the most infamous example. For this reason, CSO is now used mainly only in shots where scale is unimportant. A good example of such a usage is in the broadcasting of weather forecasts. A map will be placed in the same shot as a weather forecaster via CSO, largely because viewers aren't expecting that a weather broadcaster's hand will have the same scale as a map."

source: http://tardis.wikia.com/wiki/CSO

"Visual effects saw in the script for Invasion of the Dinosaurs the requirement for a twenty foot monster which they then made in the customary way. They produced a man-sized dinosaur suit which was so heavy when worn that in order to support the weight of it, the head had a ring bolt through it, fixed to a line attached to the ceiling. Therefore the actor inside could only move within the narrow radius allowed by the length of the line. After that, they used CSO (Colour Separation Overlay) to make the creature seem large beside Jon Pertwee, and it wasn't until a while afterwards that it dawned on everyone that there was no need to have gone to all the time and expense of building a full-sized suit. They could have achieved exactly the same effect using a puppet two feet tall, operated by rods."

source: https://them0vieblog.com/2013/03/26/doctor-who-invasion-of-the-dinosaurs-review/

The Secret: Green Puppeteers


image source: GOOGLE search


image source: Cooking in space: whole red rice and turmeric chicken •  European Space Agency, ESA  •   https://www.youtube.com/watch?v=4exaXdPKS3Y

Food Falls Clip

FYI: Screens For Chroma Keys Need Not Be Colored Green

"As the name implies, it involves separating the colour of the background from one shot then overlaying that now-"background-less" shot onto another one. Usually the colour of the background will be green, blue or yellow, because these colours are not natural human skin tones. The mixer then "keys out" that colour, replacing it with the image of another shot. While this "greenscreen" process is a common building block to other special effects processes, like CGI, what distinguishes CSO is the fact that the two shots are mixed together and then recorded live. Any errors of alignment, scale or colour separation are committed to the shot."

source: http://tardis.wikia.com/wiki/CSO

original video:

source: Cooking in space: whole red rice and turmeric chicken •  European Space Agency, ESA  •   https://www.youtube.com/watch?v=4exaXdPKS3Y

alternative link: https://youtu.be/x5m7UYtp7hE

1984: NASA Fakery

source: Satellite Recovery From Orbit: Space Shuttle STS-51-A Highlights 1984 NASA 14th Shuttle Mission  •  Jeff Quitney  •  https://www.youtube.com/watch?v=HxW0ocPdWok

How To Fake "Outer Space" and "Zero-Gravity": Wires That Are Invisible To The Camera "Eye", Or CGI?

Stage magic deus ex machina illusions would seem to be as old as civilization itself.

"Deus ex machina (Latin: [ˈdeʊs ɛks ˈmaː.kʰɪ.naː]: /ˈdeɪ.əs ɛks ˈmɑːkiːnə/ or /ˈdiːəs ɛks ˈmækɪnə/; plural: dei ex machina) is a Latin calque from Greek ἀπὸ μηχανῆς θεός (apò mēkhanês theós), meaning 'god from the machine'. The term has evolved to mean a plot device whereby a seemingly unsolvable problem is suddenly and abruptly resolved by the inspired and unexpected intervention of some new event, character, ability, or object. Its function can be to resolve an otherwise irresolvable plot situation, to surprise the audience, to bring the tale to a happy ending, or act as a comedic device."

"The term was coined from the conventions of Greek tragedy, where a machine is used to bring actors playing gods onto the stage. The machine could be either a crane (mechane) used to lower actors from above or a riser that brought actors up through a trapdoor. Preparation to pick up the actors was done behind the skene. The idea was introduced by Aeschylus and was used often to resolve the conflict and conclude the drama. Although the device is associated mostly with Greek tragedy, it also appeared in comedies."

source: https://en.wikipedia.org/wiki/Deus_ex_machina

The Secret: An Invisible Green Screen Puppeteer Helper


image source: International Space Fakery and http://chromagreen.ca

NASA 1984: Early CGI or Traditional Special Effects Wizardry?

NASA, the aerospace industry, and the Military (GOVERNMENT) have always been involved with and connected to Hollywood. NASA and company have always funded research into visual effects, communications and computer technology, and augmented and virtual reality gear. CGI has always been important to the aerospace, automotive and advertising industries. CAD programs have been used to design automotive and aerospace vehicles for decades, for example. The government has always used tax money to fund Hollywood-style special effects ventures of one type or another. Works of art used for constructing social roles and ego identities, and works of art used as society building relics, have always been commissioned by those with the financial resources to do so. The very wealthy have always patronized the arts. The works of art produced as a result are, more often than not, used for propaganda purposes.

Works of art act as socially manipulative constructs that reinforce various governmental lies and illusions required by the few to keep the many toiling away at the wheel of industry for all time. In the case of the "Outer Space" programs of the World, what we have is a long-standing propaganda con job designed to promote both the lie of "Outer Space" exploration and colonization and the lie of ballistic nuclear devastation we are all supposed to live in fear of. Huge rockets are supposed to be the medium for both travel to cosmic promised lands and worldwide nuclear devastation. Huge rockets are supposed to be symbols of both hope and doom.

The World has long been run as one financial enterprise. Nation state borders are as fake as the Atomic Age Cold War Space Race that never seems to end.

We are supposed to believe in all sorts of Governmentally promoted Platonic Shadow Cave Phantom Menace threats. We are supposed to truly believe our taxes go for our collective defense and we are not supposed to even suspect that the money is being spent on visual special effect propaganda that only benefits needless Government.

NASA 1984

Early CGI or wire work?


We have been indoctrinated to believe what we see on screens.

This is a huge mistake. Film and video are claims, not evidence. Photo fakery is as old as photography itself.

image source: Satellite Recovery From Orbit: Space Shuttle STS-51-A Highlights 1984 NASA 14th Shuttle Mission  •  Jeff Quitney  •  https://www.youtube.com/watch?v=HxW0ocPdWok

1984: Floating M&M's  •  1996: Animated Character M&M's


"As you might recall, Santa Claus faints dead away when he unexpectedly encounters two colorful M&M’s spokescandies in BBDO’s classic TV spot from 1996. The Red M&M faints, too. All of this leaves Yellow with a concerned look on his face."

image and quote source: http://www.adweek.com/creativity/21-years-later-mms-unwraps-a-sequel-to-its-classic-christmas-ad/

NASA, JPL & CGI: 1977

"Bob Holzman of NASA's Jet Propulsion Laboratory in California established JPL's Computer Graphics Lab in 1977 as a group with technology expertise in visualizing data being returned from NASA missions. On the advice of Ivan Sutherland, Holzman hired a graduate student from Utah named Jim Blinn. Blinn had worked with imaging techniques at Utah, and developed them into a system for NASA's visualization tasks. He produced a series of widely seen "fly-by" simulations, including the Voyager, Pioneer and Galileo spacecraft fly-bys of Jupiter, Saturn and their moons. He also worked with Carl Sagan, creating animations for his Cosmos: A Personal Voyage TV series. Blinn developed many influential new modelling techniques, and wrote papers on them for the IEEE (Institute of Electrical and Electronics Engineers), in their journal Computer Graphics and Applications. Some of these included environment mapping, improved highlight modelling, "blobby" modelling, simulation of wrinkled surfaces, and simulation of cloth and dusty surfaces. Later in the 80s, Blinn developed CG animations for an Annenberg/CPB TV series, The Mechanical Universe, which consisted of over 500 scenes for 52 half-hour programs describing physics and mathematics concepts for college students. This he followed with production of another series devoted to mathematical concepts, called Project Mathematics!."

source: https://en.wikipedia.org/wiki/History_of_computer_animation#Towards_3D:_mid-1970s_into_the_1980s


Green Screen Puppet Master Fakery


image source: "They're called 'Green Screen Fluffers' - " • https://imgur.com/gallery/qm5O6pa


Modern Special Effects, Wires and Flight Harnesses, Computer-Motion Controlled Cameras and Computer "Magic"

Creating the illusion of "Outer Space" is easier today than ever before


image source: https://www.yahoo.com/entertainment/blogs/movie-news/gravity-30-minute-special-shows-sandra-bullock-became-185633797.html

The 2013 Film "Gravity"

"For some scenes, the actors were filmed as they swam through their moves underwater. For others, they were hooked up with a 12-wire suspension system, and then filmed with robotic cameras while puppeteers pulled their strings. (The harness for the wires had to be made just right to fit under Sandra Bullock's skivvies.) Still other scenes were shot while the actors perched on a variety of rigs set up on a turntable. "

Film Synopsis:

"Two astronauts work together to survive after an accident which leaves them stranded in space."

source: https://www.nbcnews.com/science/how-gravity-threw-sandra-bullock-zero-gravity-big-screen-8c11326787

source: http://www.imdb.com/title/tt1454468/


Hollywood creates the illusion of astronauts doing things like floating through hatches with old school wires and harnesses, motion control cameras and new school CGI magic.

Even without CGI, old school darkroom fakery techniques allow for the crafting of very believable imagery. There are many ways to fool the camera eye.

Today's electronic technology makes doing the job that much easier.

image source: https://www.youtube.com/watch?v=QxHc8Ns5g1c


"There were some very real, very physical things that made Gravity possible, like that two-ton camera rig and wires that allowed Bullock to be manipulated like a marionette-- we'll get into those later. But maybe the most miraculous thing about the movie is how it uses CGI in ways that makes even jaded moviegoers marvel at the power of computers. "We knew that the only way to achieve this was digitally," Heyman tells us. "It wasn’t even a question." Some scenes, like the bravura "single-take shot" that opens the film, are almost entirely CGI, with the faces of the actors the only elements that weren't created digitally. Others, like scenes where Bullock is strapped in inside a ship, had tiny touches of CGI, "things like belts floating and the like." Even in the film's final scene, where Bullock crash lands on Earth into a miraculously verdant and Earth-like atmosphere, "we did an awful lot of digital work to green that up, so it didn’t look like an alien landscape." Yes, even the single scene of the movie set here on Earth required digital manipulation. For many actors who are in CGI-heavy films, the end result can seem alien, as if their work is just a speck inside a story they don't even recognize. But as Bullock told Hitfix, the huge amount of CGI allowed her to experience the movie the way we did. "I had never seen George's side. I'd never seen the stars, the space shuttle. I didn't know what the music was. I didn't know what I looked like in the suit. I didn't know what we were making or what that looked and felt like. And, I felt like, all of a sudden, this was a gigantic organism that had been created out of technology. And to see it in that beautiful 3D where it was used so sparingly and emotionally? I don't know how to describe it." "

source: https://www.cinemablend.com/new/How-Did-Gravity-Do-Secrets-Behind-Its-Groundbreaking-Special-Effects-39790.html?story_page=2

source: Interstellar - The Simulation of Zero G (Bonus Feature)  • Guitarfollower22 •  https://www.youtube.com/watch?v=YEt2cClL6Wk

The Lightbox

"To capture Bullock and Clooney's faces and bodies in the right light and from the right camera angles, the production team constructed what's probably their biggest innovation: The Lightbox. A 9x9x9 foot cube outfitted with 4,096 LED bulbs, the Lightbox could mimic the light effects from any given scene to allow shadows to play across Bullock's face. It also contained a rig to hold her in place and allow the camera-- often on that two-ton beast that Heyman mentioned, like the ones used to build cars-- to move entirely around her. "It would race up and down that track and the camera was on the end of that robot and around the head it was able to go 360." Though Bullock's character spends much of the film spinning in space, she could never actually go upside down; as Heyman explained to Film.com, "The camera was doing the movement, because, if she goes upside down, then you can see her face and her body straining, and the whole thing with zero G, there is no strain." "

source: https://www.cinemablend.com/new/How-Did-Gravity-Do-Secrets-Behind-Its-Groundbreaking-Special-Effects-39790.html?story_page=2


"Cuaron and Heyman imported the puppeteers behind the smash hit Broadway and West End productions of War Horse to achieve the scenes inside the space stations, where Bullock appears to be floating weightlessly from room to room. Movies like Apollo 13 achieved this effect with the "Vomit Comet", a NASA-outfitted plane that could send actors into free fall for 25 seconds, but Cuaron wanted longer takes, so he and the puppeteers came up with a system. Bullock was suspended from 12 carbon-thin wires, practically invisible onscreen, that were then manipulated by the puppeteers. Bullock explained the process in detail to Hitfix: "Long story short, they had all these cameras taking pictures of my body. Created almost like a Xerox copying thing, a carbon fiber copy of my chest plate. Just breasts, rib cage that seamlessly placed over my body. I could put my clothes over it. It then had little hooks in it and I would lay down on a table, it was like an operating table, where all these technicians would come in and they would hook me up to the wires and strap my legs [into them]. And then they would lift me up and the puppeteers from War Horse were in the back puppeteering me. And then I had to move the body, do the swing and do everything myself while they would push me in one direction where zero g would push you, not where earth would push you. Everything was designed so I then had to react the way zero g is. It was the closest thing to feeling like I was flying making this movie, the 12-wire. It was painful, but it was so cool."

"Why were the long takes worth it? Heyman explains, "Part of the immersive quality, I think comes from the lack of cuts. It’s in a way reminiscent of space footage that we grew up on in the ‘60s and ‘70s and even later, you know, when that was very much a part of our lives." "

source: https://www.cinemablend.com/new/How-Did-Gravity-Do-Secrets-Behind-Its-Groundbreaking-Special-Effects-39790.html?story_page=2


It's Always Sunny In Philadelphia: Green Man

The secret "key" to a really great "Zero Gravity" production is the invisible assistant.

image source: http://knowyourmeme.com/memes/green-man


History of The Chroma Key

"Mr Vlahos was not the first to use blue-screens - earlier versions of the technique can be seen in films including The Thief of Bagdad, and The Ten Commandments.

But he is credited with developing a way to use it that minimised some objects appearing to have a strange looking glow as a side-effect. He called his invention the colour-difference travelling matte scheme. Like pre-existing blue-screen techniques it involves filming a scene against an aquamarine blue-coloured background. This is used to generate a matte - which is transparent wherever the blue-colour features on the original film, and opaque elsewhere. This can then be used to superimpose a separately filmed scene or visual effects to create a composite. Mr Vlahos's breakthrough was to create a complicated laboratory process which involved separating the blue, green and red parts of each frame before combining them back together in a certain order. He also noted in a patent filing that the process allowed the blue-screen procedure to cope with glassware, cigarette smoke, blowing hair and motion blur which had all caused problems for earlier efforts. Movie studio MGM had commissioned him to invent it. Mr Vlahos later noted that it had taken him six months of thought to come up with the idea, much of it spent staring out onto Hollywood Boulevard.

"He later created a "black box" - which he called Ultimatte - to handle the process, first for film and then electronically for video. Mr Vlahos was also awarded a patent for his work on a related technique called sodium vapour illumination, which he developed for Disney. This involved filming the actors' scenes against a white backdrop using sodium-powered lamps which caused a yellow glow to bounce off the background. The camera featured two film stocks shot simultaneously, and a prism on its lens. The prism split the yellow sodium light away from the other colours, sending it to a black-and-white-based film stock which was then used to create the matte. Meanwhile, the other film stock recorded the scenes in colour without the sodium's yellow cast being visible. The advantage was that this created an even cleaner effect than Mr Vlahos' original blue-screen efforts."

"Disney used Mr Vlahos's version of the technique to make Mary Poppins, Bedknobs and Broomsticks, and Pete's Dragon - among other movies - letting its actors appear to interact with cartoons. Alfred Hitchcock also borrowed the technique for The Birds, and Warren Beatty later used it in Dick Tracy."

"Ultimatte now offers a software plug-in for Avid and Apple's Final Cut editing programs. However, it has since fallen out of favour because the equipment involved is more expensive and cumbersome to operate, and the quality of blue- and green-screen techniques has improved."

image and quote source: http://www.bbc.com/news/technology-21463817
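The matte logic described above, transparent wherever the key colour appears and opaque everywhere else, reduces to a few lines of code. Here is a minimal Python sketch of a crude binary matte; this is an illustration only, not Vlahos's colour-difference process, and the pixel values and threshold are invented for the example:

```python
def chroma_key(foreground, background, key=(0, 0, 255), threshold=100):
    """Composite foreground pixels over a background using a crude binary
    matte: any pixel close to the key colour is treated as transparent."""
    out = []
    for fg_px, bg_px in zip(foreground, background):
        # Distance of the foreground pixel from the key (screen) colour.
        dist = sum((c - k) ** 2 for c, k in zip(fg_px, key)) ** 0.5
        # Opaque where the subject is, transparent where the screen shows.
        out.append(fg_px if dist > threshold else bg_px)
    return out

# One blue "screen" pixel and one red "subject" pixel over a grey plate.
fg = [(0, 0, 255), (255, 0, 0)]
bg = [(10, 10, 10), (10, 10, 10)]
print(chroma_key(fg, bg))  # [(10, 10, 10), (255, 0, 0)]
```

Real keyers avoid the hard edge this binary matte produces; Vlahos's contribution was precisely the partial-transparency handling that keeps smoke, glassware and blowing hair believable.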

Real Time "Augmented" Virtual Reality Technology

source: The fake NASA ISS interior - a technical breakdown by Mike Helmick  • markksargent •  https://www.youtube.com/watch?v=uJhL7y0ahUE

I don't think the Earth is best described as flat, but this guy is on the right track.

DARPA: How Our Tax Money Is Spent On Computer Tech For Video Games

NASA and the Military have been funding research and development of CGI video game technology and virtual reality for decades.

"The Defense Advanced Research Projects Agency (DARPA) is an agency of the United States Department of Defense responsible for the development of emerging technologies for use by the military. Originally known as the Advanced Research Projects Agency (ARPA), the agency was created in February 1958 by President Dwight D. Eisenhower in response to the Soviet launching of Sputnik 1 in 1957. Since its inception, the agency's mission is ensuring that the United States avoids further technological surprise.[3] By collaborating with academic, industry, and government partners, DARPA formulates and executes research and development projects to expand the frontiers of technology and science, often beyond immediate U.S. military requirements. DARPA-funded projects have provided significant technologies that influenced many non-military fields, such as computer networking and the basis for the modern Internet, and graphical user interfaces in information technology. DARPA is independent of other military research and development and reports directly to senior Department of Defense management. DARPA has about 240 employees, of whom approximately 15 are in management, and close to 140 are technical staff."

source: https://en.wikipedia.org/wiki/DARPA


image source: NASA and  "SAMSUNG GEAR VR - MARS IS A REAL PLACE - Satellite imagery and the Mars Rover in Virtual Reality"

1985: NASA's Virtual Visual Environment Display (VIVED)

"During the '80s and '90s, the public was gripped by VR fever. Computer scientist, writer and former Atari researcher Jaron Lanier popularized the term "virtual reality" (VR) to describe the immersion of one's body and mind in an artificial, three-dimensional space, and a variety of products hit the consumer market aimed at connecting people with this new digital environment. But the federally supported R&D sector was where significant investments were made in practical applications for the technology. NASA's Ames Research Center played host to a VR research project launched by Michael McGreevy in 1985 and within a year it was ready to show off a working prototype of its Virtual Visual Environment Display (VIVED) helmet at CES."

"Key developments in simulated environments can be traced back to 1957, when cinematographer Morton Heilig invented the "Sensorama" booth and "Telesphere Mask," where video, sound, vibration and wind were used to replicate a real-world experience. In 1968, Ivan Sutherland went one step further when he built a head-mounted device using mini CRT displays to produce an immersive graphical simulation. It adjusted the user's view inside a 3D environment according to their head movements in the real world."

image and quote source: https://www.engadget.com/2013/12/15/time-machines/


1989: 20 Years After Apollo 11, NASA is interested in (military grade) virtual reality research.

source: Applied Virtual Reality Research And Applications at NASA/Marshall ...  •  https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19970006956.pdf

source: NASA Ames VIEWlab VR demo reel 1989  •  Scott Fisher  • https://www.youtube.com/watch?v=3L0N7CKvOBA

360° VR IS HERE!

360 degree virtual reality is here. It is only a matter of time before virtual and augmented reality become ubiquitous.


NASA Virtual Reality Research and Development: Simulating "Outer Space" Here On Earth

source: NASA's Virtual Reality Lab For Astronaut Training  •  NVIDIA  •  https://www.youtube.com/watch?v=cwK3MatOQFc




source: Ross End-to-End Virtual Productions Workflow Demonstration  •  Ross Video Limited  •  https://www.youtube.com/watch?v=C4PwEygbs1A




source: The History Of Special Effects - part 1  •  William Forsche  •  https://www.youtube.com/watch?v=DGfSz8d-Z6I&t=25s


source: Miniatures in Space Interstellar Making  • Himanshu Baswal •  https://www.youtube.com/watch?v=amJY_LDoh88



source: Ultimate Virtual Set production / Griffith Park  •  Galaxy Hunter  •  https://www.youtube.com/watch?v=anHHZs_aBFo

You Will Only Be Able To Interact With "Outer Space" Artifacts By Virtual and Augmented Means

Huge rockets and the place called "Outer Space" are fake. These Cold War era relics are designed to inspire awe and terror. Huge rockets are supposed to be symbols of the promised land of space exploration and the medium for ballistic and nuclear style annihilation. Luckily, all the evidence we can examine and all real world demonstrable science indicates we have all been exposed to nothing but a decades old, international, propaganda campaign designed to keep us all believing in bogeyman, phantom menace style threats so we continue paying taxes for governmental protections we do not need.


"Outer Space" is the best Platonic Shadow Cave theater screen yet. It can only exist as art of some kind. "Outer Space" is an artifact of a modern, secular religious faith.

see also: https://en.wikipedia.org/wiki/Relationship_between_religion_and_science 

source: ARiane - Rocket launch in Augmented Reality  • Guillaume Ktz •  https://www.youtube.com/watch?v=vgjY8P2WkgM


Outer Space Is Fake: How To Create The "Zero-Gravity" Illusion

It is easy to fake "Outer Space".

From Georges Méliès' 1902 "A Trip To The Moon" to Fritz Lang's 1929 "Woman In The Moon", to today's feats of international space fakery, outer space was always an imaginary place that could only exist as art. Outer space can only exist on a screen of some kind. Outer space is a myth and an idea. It is not a real place any of us can go to. Outer Space is the result of Hollywood film magic illusion; it is not the result of real demonstrable scientific principle, despite all world-wide governmental and university hype and all the decades of international governmental marketing public relations propaganda to the contrary.

Outer Space is an artifact of a state-sponsored secular religious faith that many of us are unconscious adherents of.

Article link: http://www.aamorris.net/properganderatpropaganda/2018/1/8/-outer-space-is-fake-how-to-create-the-zero-gravity-illusion

source: Training Superman Flight 'Superman Returns' Featurette  • Flashback FilmMaking  •  https://www.youtube.com/watch?v=qbNm789GzxE



The Power of CGI

What our taxes really pay for: research and development of computer graphics technology.

"Early 3D animation in the cinema"

"The first use of 3D wireframe imagery in mainstream cinema was in the sequel to Westworld, Futureworld (1976), directed by Richard T. Heffron. This featured a computer-generated hand and face created by then University of Utah graduate students Edwin Catmull and Fred Parke which had initially appeared in their 1972 experimental short A Computer Animated Hand.[43] The Oscar-winning 1975 short animated film Great, about the life of the Victorian engineer Isambard Kingdom Brunel, contains a brief sequence of a rotating wireframe model of Brunel's final project, the iron steam ship SS Great Eastern. The third movie to use this technology was Star Wars (1977), written and directed by George Lucas, with wireframe imagery in the scenes with the Death Star plans, the targeting computers in the X-wing fighters, and the Millennium Falcon spacecraft. The Walt Disney film The Black Hole (1979, directed by Gary Nelson) used wireframe rendering to depict the titular black hole, using equipment from Disney's engineers. In the same year, the science-fiction horror film Alien, directed by Ridley Scott, also used wireframe model graphics, in this case to render the navigation monitors in the spaceship. The footage was produced by Colin Emmett at the Atlas Computer Laboratory."

Nelson Max

"Although Lawrence Livermore Labs in California is mainly known as a centre for high-level research in science, it continued producing significant advances in computer animation throughout this period. Notably, Nelson Max, who joined the Lab in 1971, and whose 1977 film Turning a sphere inside out is regarded as one of the classic early films in the medium (International Film Bureau, Chicago, 1977). He also produced a series of "realistic-looking" molecular model animations that served to demonstrate the future role of CGI in scientific visualization ("CGI" = Computer-generated imagery). His research interests focused on realism in nature images, molecular graphics, computer animation, and 3D scientific visualization. He later served as computer graphics director for the Fujitsu pavilions at Expo 85 and 90 in Japan."


"In 1974, Alex Schure, a wealthy New York entrepreneur, established the Computer Graphics Laboratory (CGL) at the New York Institute of Technology (NYIT). He put together the most sophisticated studio of the time, with state of the art computers, film and graphic equipment, and hired top technology experts and artists to run it -- Ed Catmull, Malcolm Blanchard, Fred Parke and others all from Utah, plus others from around the country including Ralph Guggenheim, Alvy Ray Smith and Ed Emshwiller. During the late 70s, the staff made numerous innovative contributions to image rendering techniques, and produced much influential software, including the animation program Tween, the paint program Paint, and the animation program SoftCel. Several videos from NYIT became quite famous: Sunstone, by Ed Emshwiller, Inside a Quark, by Ned Greene, and The Works. The latter, written by Lance Williams, was begun in 1978, and was intended to be the first full-length CGI film, but it was never completed, though a trailer for it was shown at SIGGRAPH 1982. In these years, many people regarded NYIT CG Lab as the top computer animation research and development group in the world."

"The quality of NYIT's work attracted the attention of George Lucas, who was interested in developing a CGI special effects facility at his company Lucasfilm. In 1979, he recruited the top talent from NYIT, including Catmull, Smith and Guggenheim to start his division, which later spun off as Pixar, founded in 1986 with funding by Apple Inc. co-founder Steve Jobs."


"The framebuffer or framestore is a graphics screen configured with a memory buffer that contains data for a complete screen image. Typically, it is a rectangular array (raster) of pixels, and the number of pixels in the width and the height is its "resolution". Color values stored in the pixels can be from 1-bit (monochrome), to 24-bit (true color, 8-bits each for RGB—Red, Green, & Blue), or also 32-bit, with an extra 8-bits used as a transparency mask (alpha channel). Before the framebuffer, graphics displays were all vector-based, tracing straight lines from one co-ordinate to another. The first known example of a framebuffer was built in 1969 at Bell Labs, where Joan Miller implemented a simple paint program to allow users to "paint" direct on the framebuffer. This device had just 3-bits (giving just 8 colors). In 1972–73, Richard Shoup developed the SuperPaint system at Xerox PARC, which used a framebuffer displaying 640×480 pixels (standard NTSC video resolution) with eight-bit depth (256 colors). The SuperPaint software contained all the essential elements of later paint packages, allowing the user to paint and modify pixels, using a palette of tools and effects, and thereby making it the first complete computer hardware and software solution for painting and editing images. Shoup also experimented with modifying the output signal using color tables, to allow the system to produce a wider variety of colors than the limited 8-bit range it contained. This scheme would later become commonplace in computer framebuffers. The SuperPaint framebuffer could also be used to capture input images from video."

"The first commercial framebuffer was produced in 1974 by Evans & Sutherland. It cost about $15,000, with a resolution of 512 by 512 pixels in 8-bit grayscale color, and sold well to graphics researchers without the resources to build their own framebuffer.[54] A little later, NYIT created the first full-color 24-bit RGB framebuffer by using three of the Evans & Sutherland framebuffers linked together as one device by a minicomputer. Many of the "firsts" that happened at NYIT were based on the development of this first raster graphics system. In 1975, the UK company Quantel, founded in 1973 by Peter Michael,[55] produced the first commercial full-color broadcast framebuffer, the Quantel DFS 3000. It was first used in TV coverage of the 1976 Montreal Olympics to generate a picture-in-picture inset of the Olympic flaming torch while the rest of the picture featured the runner entering the stadium. Framebuffer technology provided the cornerstone for the future development of digital television products."

"By the late 70s, it became possible for personal computers (such as the Apple II) to contain low-color framebuffers. However, it was not until the 1980s that a real revolution in the field was seen, and framebuffers capable of holding a standard video image were incorporated into standalone workstations. By the 90s, framebuffers eventually became the standard for all personal computers."
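The pixel layout the quotes describe, 8 bits per channel packed into a 24- or 32-bit word with an optional alpha mask, can be shown directly in code. A generic Python sketch; the 0xAARRGGBB ordering is one common convention for illustration, not any particular framebuffer's actual format:

```python
def pack_rgba(r, g, b, a=255):
    """Pack four 8-bit channels into one 32-bit pixel word (0xAARRGGBB)."""
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_rgba(pixel):
    """Recover (r, g, b, a) from a 32-bit pixel word."""
    return ((pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF,
            pixel & 0xFF, (pixel >> 24) & 0xFF)

# A tiny 2x2 "framebuffer": a flat raster of packed pixel words.
width, height = 2, 2
framebuffer = [pack_rgba(0, 0, 0) for _ in range(width * height)]
framebuffer[1 * width + 0] = pack_rgba(255, 0, 0)  # "paint" one red pixel
print(hex(framebuffer[2]))  # 0xffff0000
```

Joan Miller's 3-bit device and SuperPaint's 8-bit palette are just smaller versions of the same idea: the screen image lives in addressable memory, so painting is writing to an array.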


"At this time, a major step forward to the goal of increased realism in 3D animation came with the development of "fractals". The term was coined in 1975 by mathematician Benoit Mandelbrot, who used it to extend the theoretical concept of fractional dimensions to geometric patterns in nature, and published in English translation of his book Fractals: Form, Chance and Dimension in 1977. In 1979–80, the first film using fractals to generate the graphics was made by Loren Carpenter of Boeing. Titled Vol Libre, it showed a flight over a fractal landscape, and was presented at SIGGRAPH 1980.[59] Carpenter was subsequently hired by Pixar to create the fractal planet in the Genesis Effect sequence of Star Trek II: The Wrath of Khan in June 1982."
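Fractal terrain of the kind Carpenter rendered is commonly generated by recursive midpoint displacement: subdivide each segment, then nudge the new midpoint by a random offset whose amplitude shrinks at every level. A one-dimensional Python sketch for illustration only; Vol Libre itself worked on subdivided triangle meshes in 3D:

```python
import random

def midpoint_displacement(left, right, depth, roughness=0.5, rng=None):
    """Build a fractal height profile: repeatedly subdivide each segment
    and displace the new midpoint by a random offset whose amplitude
    halves at every level of detail."""
    rng = rng or random.Random(0)  # seeded for repeatability
    heights = [left, right]
    scale = roughness
    for _ in range(depth):
        refined = []
        for a, b in zip(heights, heights[1:]):
            refined.append(a)
            refined.append((a + b) / 2 + rng.uniform(-scale, scale))
        refined.append(heights[-1])
        heights = refined
        scale /= 2  # finer detail gets smaller displacements
    return heights

profile = midpoint_displacement(0.0, 0.0, depth=4)
print(len(profile))  # 17 points: 2**4 + 1
```

The self-similarity Mandelbrot described falls out of the halving: zooming into any stretch of the profile shows the same kind of jaggedness at a smaller scale.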

JPL and Jim Blinn

"Bob Holzman of NASA's Jet Propulsion Laboratory in California established JPL's Computer Graphics Lab in 1977 as a group with technology expertise in visualizing data being returned from NASA missions. On the advice of Ivan Sutherland, Holzman hired a graduate student from Utah named Jim Blinn.[61][62] Blinn had worked with imaging techniques at Utah, and developed them into a system for NASA's visualization tasks. He produced a series of widely seen "fly-by" simulations, including the Voyager, Pioneer and Galileo spacecraft fly-bys of Jupiter, Saturn and their moons. He also worked with Carl Sagan, creating animations for his Cosmos: A Personal Voyage TV series. Blinn developed many influential new modelling techniques, and wrote papers on them for the IEEE (Institute of Electrical and Electronics Engineers), in their journal Computer Graphics and Applications. Some of these included environment mapping, improved highlight modelling, "blobby" modelling, simulation of wrinkled surfaces, and simulation of clouds and dusty surfaces. Later in the 80s, Blinn developed CG animations for an Annenberg/CPB TV series, The Mechanical Universe, which consisted of over 500 scenes for 52 half-hour programs describing physics and mathematics concepts for college students. This he followed with production of another series devoted to mathematical concepts, called Project Mathematics!."

Motion control photography

"Motion control photography is a technique that uses a computer to record (or specify) the exact motion of a film camera during a shot, so that the motion can be precisely duplicated again, or alternatively on another computer, and combined with the movement of other sources, such as CGI elements. Early forms of motion control go back to John Whitney's 1968 work on 2001: A Space Odyssey, and the effects on the 1977 movie Star Wars Episode IV: A New Hope, by George Lucas' newly created company Industrial Light & Magic in California (ILM). ILM created a digitally controlled camera known as the Dykstraflex, which performed complex and repeatable motions around stationary spaceship models, enabling separately filmed elements (spaceships, backgrounds, etc.) to be coordinated more accurately with one another. However, neither of these was actually computer-based—Dykstraflex was essentially a custom-built hard-wired collection of knobs and switches. The first commercial computer-based motion control and CGI system was developed in 1981 in the UK by Moving Picture Company designer Bill Mather."
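The essence of motion control, record a camera move once and replay it identically for every photographic pass so that separately filmed elements line up, can be sketched as a toy model. The class, its fields and the sample keyframes below are all hypothetical, purely for illustration:

```python
class MotionControlRig:
    """Toy model of a motion-control camera: record a move once, then
    replay identical positions for every photographic pass, so elements
    filmed separately line up when composited."""

    def __init__(self):
        self.keyframes = []

    def record(self, frame, position, pan, tilt):
        self.keyframes.append((frame, position, pan, tilt))

    def replay(self):
        # Every pass yields exactly the recorded motion.
        yield from self.keyframes

rig = MotionControlRig()
rig.record(1, (0.0, 0.0, 5.0), pan=0.0, tilt=0.0)
rig.record(2, (0.1, 0.0, 4.9), pan=1.5, tilt=0.0)
pass_a = list(rig.replay())  # first pass: e.g. the spaceship model
pass_b = list(rig.replay())  # second pass: e.g. the background plate
print(pass_a == pass_b)  # True
```

The Dykstraflex achieved the same repeatability with hard-wired analogue controls rather than stored data; the computer-based systems that followed simply made the keyframe list explicit.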

3D computer graphics software

"3D computer graphics software began appearing for home computers in the late 1970s. The earliest known example is 3D Art Graphics, a set of 3D computer graphics effects, written by Kazumasa Mitazawa and released in June 1978 for the Apple II."

The 1980s

"The '80s saw a great expansion of radical new developments in commercial hardware, especially the incorporation of framebuffer technologies into graphic workstations, allied with continuing advances in computer power and affordability."

Silicon Graphics, Inc (SGI)

"Silicon Graphics, Inc (SGI) was a manufacturer of high-performance computer hardware and software, founded in 1981 by Jim Clark. His idea, called the Geometry Engine, was to create a series of components in a VLSI processor that would accomplish the main operations required in image synthesis—the matrix transforms, clipping, and the scaling operations that provided the transformation to view space. Clark attempted to shop his design around to computer companies, and finding no takers, he and colleagues at Stanford University, California, started their own company, Silicon Graphics. SGI's first product (1984) was the IRIS (Integrated Raster Imaging System). It used the 8 MHz M68000 processor with up to 2 MB memory, a custom 1024×1024 frame buffer, and the Geometry Engine to give the workstation its impressive image generation power. Its initial market was 3D graphics display terminals, but SGI's products, strategies and market positions evolved significantly over time, and for many years were a favoured choice for CGI companies in film, TV, and other fields."
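The "transformation to view space" that the Geometry Engine hardwired boils down to matrix transforms, clipping, and a perspective divide. A minimal Python sketch of just the projection and clipping step; the focal length and the simple z-test are simplified assumptions, not SGI's actual pipeline:

```python
def project(point, focal=1.0):
    """Perspective-project a camera-space point (x, y, z) onto the image
    plane: scale x and y by focal / -z. Points at or behind the camera
    (z >= 0) are clipped, mirroring the pipeline's clipping stage."""
    x, y, z = point
    if z >= 0:
        return None  # clipped: behind (or at) the camera
    return (focal * x / -z, focal * y / -z)

# A point twice as far away lands at half the screen coordinates.
print(project((1.0, 1.0, -1.0)))  # (1.0, 1.0)
print(project((1.0, 1.0, -2.0)))  # (0.5, 0.5)
```

Clark's insight was that this small, fixed set of operations is needed for every vertex of every frame, so it pays to cast it into dedicated VLSI hardware rather than run it on a general-purpose CPU.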


"In 1981, Quantel released the "Paintbox", the first broadcast-quality turnkey system designed for creation and composition of television video and graphics. Its design emphasized the studio workflow efficiency required for live news production. Essentially, it was a framebuffer packaged with innovative user software, and it rapidly found applications in news, weather, station promos, commercials, and the like. Although it was essentially a design tool for still images, it was also sometimes used for frame-by-frame animations. Following its initial launch, it revolutionised the production of television graphics, and some Paintboxes are still in use today due to their image quality, and versatility. This was followed in 1982 by the Quantel Mirage, or DVM8000/1 "Digital Video Manipulator", a digital real-time video effects processor. This was based on Quantel's own hardware, plus a Hewlett-Packard computer for custom program effects. It was capable of warping a live video stream by texture mapping it onto an arbitrary three-dimensional shape, around which the viewer could freely rotate or zoom in real-time. It could also interpolate, or morph, between two different shapes. It was considered the first real-time 3D video effects processor, and the progenitor of subsequent DVE (Digital video effect) machines. In 1985, Quantel went on to produce "Harry", the first all-digital non-linear editing and effects compositing system."

Osaka University

"In 1982, Japan's Osaka University developed the LINKS-1 Computer Graphics System, a supercomputer that used up to 257 Zilog Z8001 microprocessors, used for rendering realistic 3D computer graphics. According to the Information Processing Society of Japan: "The core of 3D image rendering is calculating the luminance of each pixel making up a rendered surface from the given viewpoint, light source, and object position. The LINKS-1 system was developed to realize an image rendering methodology in which each pixel could be parallel processed independently using ray tracing. By developing a new software methodology specifically for high-speed image rendering, LINKS-1 was able to rapidly render highly realistic images." It was "used to create the world's first 3D planetarium-like video of the entire heavens that was made completely with computer graphics. The video was presented at the Fujitsu pavilion at the 1985 International Exposition in Tsukuba." The LINKS-1 was the world's most powerful computer, as of 1984."
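The per-pixel independence that made LINKS-1's parallelism possible is easy to see in code: each pixel's luminance comes from its own ray, with no dependence on neighbouring pixels. A minimal ray-sphere sketch in Python; the scene and the depth-based shading are invented for illustration:

```python
def trace_pixel(x, y, width, height):
    """Luminance of one pixel: cast a ray from the origin through the pixel
    toward a unit sphere centred at (0, 0, -3); shade by hit distance.
    No pixel depends on any other, so all pixels could run in parallel."""
    # Ray direction through the pixel centre on an image plane at z = -1.
    dx = (x + 0.5) / width * 2 - 1
    dy = (y + 0.5) / height * 2 - 1
    dz = -1.0
    # Solve |t*d - c|^2 = r^2 for sphere centre c = (0, 0, -3), radius r = 1.
    a = dx * dx + dy * dy + dz * dz
    b = 2 * dz * 3        # 2 * d.(o - c), with eye o = 0, so o - c = (0, 0, 3)
    c = 9 - 1             # |o - c|^2 - r^2
    disc = b * b - 4 * a * c
    if disc < 0:
        return 0.0        # ray misses the sphere: black background
    t = (-b - disc ** 0.5) / (2 * a)
    return max(0.0, 1.0 - (t - 2.0))  # nearer hits shade brighter

# Render an 8x8 image; the centre hits the sphere, the corners miss.
image = [[trace_pixel(x, y, 8, 8) for x in range(8)] for y in range(8)]
```

Because `trace_pixel` reads no shared state, the double loop could be handed to 257 processors, one pixel each, which is exactly the parallelism the LINKS-1 quote describes.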

3D Fictional Animated Films at the University of Montreal

"In the '80s, University of Montreal was at the forefront of computer animation with three successful short 3D animated films with 3D characters: In 1983, Philippe Bergeron, Nadia Magnenat Thalmann, and Daniel Thalmann directed Dream Flight, considered as the first 3D generated film telling a story. The film was completely programmed using the MIRA graphical language,[74] an extension of the Pascal programming language based on abstract graphical data types.[75] The film got several awards and was shown at the SIGGRAPH '83 Film Show. In 1985, Pierre Lachapelle, Philippe Bergeron, Pierre Robidoux and Daniel Langlois directed Tony de Peltrie, which shows the first animated human character to express emotion through facial expressions and body movements, which touched the feelings of the audience. Tony de Peltrie premiered as the closing film of SIGGRAPH'85. In 1987, the Engineering Institute of Canada celebrated its 100th anniversary. A major event, sponsored by Bell Canada and Northern Telecom (now Nortel), was planned for the Place des Arts in Montreal. For this event, Nadia Magnenat Thalmann and Daniel Thalmann simulated Marilyn Monroe and Humphrey Bogart meeting in a cafe in the old town section of Montreal. The short movie, called Rendez-vous in Montreal was shown in numerous festivals and TV channels all over the world."

Sun Microsystems, Inc

"The Sun Microsystems company was founded in 1982 by Andy Bechtolsheim with other fellow graduate students at Stanford University. Bechtolsheim originally designed the SUN computer as a personal CAD workstation for the Stanford University Network (hence the acronym "SUN"). It was designed around the Motorola 68000 processor with the Unix operating system and virtual memory, and, like SGI, had an embedded frame buffer.[79] Later developments included computer servers and workstations built on its own RISC-based processor architecture and a suite of software products such as the Solaris operating system, and the Java platform. By the '90s, Sun workstations were popular for rendering in 3D CGI filmmaking—for example, Disney-Pixar's 1995 movie Toy Story used a render farm of 117 Sun workstations.[80] Sun was a proponent of open systems in general and Unix in particular, and a major contributor to open source software."

National Film Board of Canada

"The NFB's French-language animation studio founded its Centre d'animatique in 1980, at a cost of $1 million CAD, with a team of six computer graphics specialists. The unit was initially tasked with creating stereoscopic CGI sequences for the NFB's 3-D IMAX film Transitions for Expo 86. Staff at the Centre d'animatique included Daniel Langlois, who left in 1986 to form Softimage."

First turnkey broadcast animation system

"Also in 1982, the first complete turnkey system designed specifically for creating broadcast-standard animation was produced by the Japanese company Nippon Univac Kaisha ("NUK", later merged with Burroughs), and incorporated the Antics 2-D computer animation software developed by Alan Kitching from his earlier versions. The configuration was based on the VAX 11/780 computer, linked to a Bosch 1-inch VTR, via NUK's own framebuffer. This framebuffer also showed realtime instant replays of animated vector sequences ("line test"), though finished full-color recording would take many seconds per frame.[84][85][86] The full system was successfully sold to broadcasters and animation production companies across Japan. Later in the '80s, Kitching developed versions of Antics for SGI and Apple Mac platforms, and these achieved a wider global distribution."

First solid 3D CGI in the movies

"The first cinema feature movie to make extensive use of solid 3D CGI was Walt Disney's Tron, directed by Steven Lisberger, in 1982. The film is celebrated as a milestone in the industry, though less than twenty minutes of this animation were actually used—mainly the scenes that show digital "terrain", or include vehicles such as Light Cycles, tanks and ships. To create the CGI scenes, Disney turned to the four leading computer graphics firms of the day: Information International Inc, Robert Abel and Associates (both in California), MAGI, and Digital Effects (both in New York). Each worked on a separate aspect of the movie, without any particular collaboration.[88]Tron was a box office success, grossing $33 million on a budget of $17 million."

"In 1984, Tron was followed by The Last Starfighter, a Universal Pictures / Lorimar production, directed by Nick Castle, and was one of cinema's earliest films to use extensive CGI to depict its many starships, environments and battle scenes. This was a great step forward compared with other films of the day, such as Return of the Jedi, which still used conventional physical models.[90] The computer graphics for the film were designed by artist Ron Cobb, and rendered by Digital Productions on a Cray X-MP supercomputer. A total of 27 minutes of finished CGI footage was produced—considered an enormous quantity at the time. The company estimated that using computer animation required only half the time, and one half to one third the cost of traditional special effects.[91] The movie was a financial success, earning over $28 million on an estimated budget of $15 million."

Inbetweening and morphing

"The terms inbetweening and morphing are often used interchangeably, and signify the creating of a sequence of images where one image transforms gradually into another image smoothly by small steps. Graphically, an early example would be Charles Philipon's famous 1831 caricature of French King Louis Philippe turning into a pear (metamorphosis).[93] "Inbetweening" (AKA "tweening") is a term specifically coined for traditional animation technique, an early example being in E.G.Lutz's 1920 book Animated Cartoons.[94] In computer animation, inbetweening was used from the beginning (e.g., John Whitney in the '50s, Charles Csuri and Masao Komura in the '60s).[23] These pioneering examples were vector-based, comprising only outline drawings (as was also usual in conventional animation technique), and would often be described mathematically as "interpolation". Inbetweening with solid-filled colors appeared in the early '70s, (e.g., Alan Kitching's Antics at Atlas Lab, 1973,[36] and Peter Foldes' La Faim at NFBC, 1974[31]), but these were still entirely vector-based."

"The term "morphing" did not become current until the late '80s, when it specifically applied to computer inbetweening with photographic images—for example, to make one face transform smoothly into another. The technique uses grids (or "meshes") overlaid on the images, to delineate the shape of key features (eyes, nose, mouth, etc.). Morphing then inbetweens one mesh to the next, and uses the resulting mesh to distort the image and simultaneously dissolve one to another, thereby preserving a coherent internal structure throughout. Thus, several different digital techniques come together in morphing.[95] Computer distortion of photographic images was first done by NASA, in the mid-1960s, to align Landsat and Skylab satellite images with each other. Texture mapping, which applies a photographic image to a 3D surface in another image, was first defined by Jim Blinn and Martin Newell in 1976. A 1980 paper by Ed Catmull and Alvy Ray Smith on geometric transformations, introduced a mesh-warping algorithm.[96] The earliest full demonstration of morphing was at the 1982 SIGGRAPH conference, where Tom Brigham of NYIT presented a short film sequence in which a woman transformed, or "morphed", into a lynx."

"The first cinema movie to use morphing was Ron Howard's 1988 fantasy film Willow, where the main character, Willow, uses a magic wand to transform animal to animal to animal and finally, to a sorceress."

3D inbetweening

"With 3D CGI, the inbetweening of photo-realistic computer models can also produce results similar to morphing, though technically, it is an entirely different process (but is nevertheless often also referred to as "morphing"). An early example is Nelson Max's 1977 film Turning a sphere inside out.[46] The first cinema feature film to use this technique was the 1986 Star Trek IV: The Voyage Home, directed by Leonard Nimoy, with visual effects by George Lucas's company Industrial Light & Magic (ILM). The movie includes a dream sequence where the crew travel back in time, and images of their faces transform into one another. To create it, ILM employed a new 3D scanning technology developed by Cyberware to digitize the cast members' heads, and used the resulting data for the computer models. Because each head model had the same number of key points, transforming one character into another was a relatively simple inbetweening."

The Abyss

"In 1989 James Cameron's underwater action movie The Abyss was released. This was the first cinema movie to include photo-realistic CGI integrated seamlessly into live-action scenes. A five-minute sequence featuring an animated tentacle or "pseudopod" was created by ILM, who designed a program to produce surface waves of differing sizes and kinetic properties for the pseudopod, including reflection, refraction and a morphing sequence. Although short, this successful blend of CGI and live action is widely considered a milestone in setting the direction for further future development in the field."

Walt Disney & CAPS

"The late 80s saw another milestone in computer animation, this time in 2D: the development of Disney's "Computer Animation Production System", known as "CAPS". This was a custom collection of software, scanners and networked workstations developed by The Walt Disney Company in collaboration with Pixar. Its purpose was to computerize the ink-and-paint and post-production processes of traditionally animated films, to allow more efficient and sophisticated post-production by making the practice of hand-painting cels obsolete. The animators' drawings and background paintings are scanned into the computer, and animation drawings are inked and painted by digital artists. The drawings and backgrounds are then combined, using software that allows for camera movements, multiplane effects, and other techniques—including compositing with 3D image material. The system's first feature film use was in The Little Mermaid (1989), for the "farewell rainbow" scene near the end, but the first full-scale use was for The Rescuers Down Under (1990), which therefore became the first traditionally animated film to be entirely produced on computer—or indeed, the first 100% digital feature film of any kind ever produced."

source: https://en.wikipedia.org/wiki/History_of_computer_animation#Towards_3D:_mid-1970s_into_the_1980s