Movie Distribution Sucks

The Big Six: Warner Bros., Paramount, Sony Pictures, Universal, Fox, and Disney. A major film studio is a production and distribution company that releases a substantial number of films annually and consistently commands a significant share of box office revenue in a given market. In the North American and global markets, the major film studios, often known simply as the majors, are commonly regarded as the six diversified media conglomerates whose film production and distribution subsidiaries collectively command approximately 80 to 85 percent of U.S. and Canadian box office revenue. The term may also be applied more specifically to the primary motion picture subsidiary of each conglomerate.

Today the Big Six major studios are primarily backers and distributors of films whose actual production is largely handled by independent companies, either long-running entities or ones created for a single picture. Their specialty divisions often simply acquire distribution rights to films in which the studio has had no prior involvement. While the majors still do a modicum of true production, their activities are concentrated in development, financing, marketing, and merchandising. Those business functions are still usually performed in or near Los Angeles, even though the phenomenon of runaway production means that most films are now shot mostly or entirely on location outside Los Angeles.

Since the dawn of filmmaking, the U.S. major film studios have dominated both American cinema and the global film industry. Today, the Big Six routinely distribute hundreds of films every year into all significant international markets, that is, wherever discretionary income is high enough for consumers to afford to watch films. It is very rare, if not unheard of, for a film to reach a broad international audience across multiple continents and languages without first being picked up by one of the majors for distribution.

The Technology of Movies

Films have been a vital part of popular culture for about 100 years. Though fundamentally a photographic medium, motion pictures have long relied on electrical, electronic, and computer technologies.

Photo by Zen Icknow, via Wired.com

In the era of silent motion pictures, cameras and projectors used electric motors to maintain a constant film speed. In addition, electric lights were vital in both filming and projection. In the studios’ early years, arc floodlights and Cooper-Hewitt mercury-vapor tubes were the most important sources. From the mid-1920s on, incandescent tungsten bulbs became common, both because better incandescent lamps had recently become available and because they did not produce the audible hum that arc lights did. The hum had not been a problem with silent films. After about 1940, tungsten floodlights with a reflecting surface inside the bulb, behind the filament, became common.

A transformation of the film world began in the late 1920s. In partnership with Warner Brothers and Vitaphone, Western Electric and Bell Labs produced Don Juan, a sound film starring John Barrymore that premiered on 6 August 1926. Though there was no spoken dialogue, the music and some sound effects, notably the saber fights, were synchronized to the action. A year later, The Jazz Singer, starring Al Jolson, did have spoken lines, and its triumph sent movie producers racing to make sound films.
Suddenly, electronics was critical to movie-making. Microphones converted sound into an electrical signal, which could be amplified or otherwise processed. A photocell converted the photographic soundtrack back into an electrical signal. And amplifying tubes, together with loudspeakers, recreated the sound for the audience. Indeed, sound films brought about the mass production of photocells, since they created the first large market for such tubes.
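
As a rough illustration of that signal chain, here is a toy sketch in Python: a sampled waveform stands in for the microphone’s electrical signal, a gain factor for the amplifying tubes, and an exposure level between 0 and 1 for the optical soundtrack that a photocell would read back. The numbers and the simple linear encoding are illustrative assumptions, not a model of any historical apparatus.

```python
import math

SAMPLE_RATE = 1000   # samples per second (toy value, not a film standard)
FREQ = 5             # frequency of the test tone in hertz

# 1. "Microphone": sample a sound wave as an electrical signal in -1..1.
signal = [math.sin(2 * math.pi * FREQ * t / SAMPLE_RATE)
          for t in range(SAMPLE_RATE)]

# 2. "Amplifier": apply a vacuum-tube-style gain to the weak signal.
GAIN = 2.0
amplified = [GAIN * s for s in signal]

# 3. "Optical soundtrack": record each sample as a film exposure level
#    between 0 (clear) and 1 (opaque), a crude variable-density track.
peak = max(abs(s) for s in amplified)
track = [(s / peak + 1) / 2 for s in amplified]

# 4. "Photocell": shining light through the track and measuring what
#    gets through recovers the electrical signal for the loudspeakers.
recovered = [2 * d - 1 for d in track]

error = max(abs(a / peak - r) for a, r in zip(amplified, recovered))
print(f"worst-case reconstruction error: {error:.2e}")
```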

Electric lighting, including neon and other colored lighting, as well as projection systems and sound systems, was essential to the success of the movie palaces of the late 1920s and the 1930s. Among the most famous were the Roxy Theater, the “Cathedral of the Movie,” at Seventh Avenue and 50th Street in New York City, and Sid Grauman’s Chinese Theatre in Hollywood.

The 1924 introduction of an editing machine called the Moviola helped with movie production. Using a variable-speed electric motor with pedal control, the Moviola made it much easier to view and splice film footage. As a result, movies became faster paced, with a shorter average shot length.

Multi-track sound was introduced in some cinemas for the first run of Fantasia in 1940, but it saw little use until stereophonic sound arrived with CinemaScope and other widescreen formats in the 1950s. At about the same time, magnetic sound recording came into use. Besides offering much higher fidelity than optical recording, it had the significant advantage of instant playback on the set. In the 1970s, Dolby noise-reduction techniques came into use, both in movie-making and in cinema presentation.

One of the most impressive advances in movie-making in the past decade or two has been the use of computer-generated imagery, beginning with Tron in 1982. Tron is the story of a video-arcade owner and former programmer who is transported into the virtual world of the Master Control Program, the computer security system of a software company. The release of Toy Story in 1995, the first feature film generated entirely by computer, was another milestone.

A revolutionary change is already under way, as computerized electronics displace photography as the underlying motion-picture technology. An important early trial of digital cinema, using large-screen electronic projectors, was the showing of Star Wars: The Phantom Menace in four Los Angeles cinemas.

Digital movies are attractive for several reasons: a film can be copied and reproduced without degradation (no wear, as with film stock); it can be distributed conveniently and inexpensively; and production costs for digital are lower than for celluloid techniques. However, the high cost of digital systems (some digital projectors cost $150,000, for example) still limits their use.

The Computer Generated Imagery (CGI) Era

Computer-generated imagery (CGI) is the application of computer graphics to create or contribute to images in art, printed media, video games, films, television programs, commercials, videos, and simulators. The visual scenes may be dynamic or static and may be two-dimensional (2D), though the term “CGI” is most commonly used to refer to the 3D computer graphics behind scenes and special effects in films and television. CGI clips can also be made by home users and edited together in programs such as Windows Movie Maker or iMovie.

The term computer animation refers to dynamic CGI rendered as a movie. The term virtual world refers to agent-based, interactive environments. Computer graphics software is used to make computer-generated imagery for films, etc.
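
As a minimal sketch of that idea, the Python script below computes each frame procedurally, with no camera involved, and then strings the frames together as a short animation. It assumes the third-party Pillow imaging library, and the bouncing-ball scene is invented purely for illustration, standing in for the principle rather than for any production CGI pipeline.

```python
import math
from PIL import Image, ImageDraw  # pip install Pillow

WIDTH, HEIGHT, FRAME_COUNT = 320, 240, 48

frames = []
for i in range(FRAME_COUNT):
    t = i / FRAME_COUNT  # normalized time from 0 to 1
    img = Image.new("RGB", (WIDTH, HEIGHT), "black")
    draw = ImageDraw.Draw(img)
    # The "scene" is pure computation: a ball follows a sine-wave path.
    x = int(t * WIDTH)
    y = int(HEIGHT / 2 + 60 * math.sin(4 * math.pi * t))
    draw.ellipse([x - 12, y - 12, x + 12, y + 12], fill="orange")
    frames.append(img)

# Dynamic CGI "rendered as a movie": 40 ms per frame is 25 frames
# per second; loop=0 makes the animation repeat indefinitely.
frames[0].save("bouncing_ball.gif", save_all=True,
               append_images=frames[1:], duration=40, loop=0)
```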

The availability of CGI software and increasing computer speeds have allowed individual artists and small companies to produce professional-grade films, games, and fine art from their home computers. This has given rise to an Internet subculture with its own global celebrities, clichés, and technical vocabulary. The evolution of CGI also led to the emergence of virtual cinematography in the 1990s, in which the movement of the simulated camera is not constrained by the laws of physics.

In recent years, computer-generated imagery has been used in courtrooms to help judges and juries visualize a sequence of events, evidence, or a hypothesis. But a 1997 study showed that people are poor intuitive physicists and are easily influenced by computer-generated images. Thus it is important that jurors and other legal decision-makers be made aware that such exhibits are merely a representation of one potential sequence of events.