Real or Fake 3D?

     BinaryVision renders real 3D



"While film studios are cashing in on 3D films, many are 'faking it' by converting 2D movies into 3D post-production. Worse, they're not upfront in their advertising, and many people feel ripped off after paying extra fees for the '3D Experience.'" - Philip Dhingra (realorfake3d)

When shooting a real 3-dimensional movie, two camera lenses are paired to capture views of the same scene from slightly different angles (positioned about six and a half centimeters apart). The difference in viewpoint between the left and right lenses is known as parallax. BinaryVision's brain wave-patterned algorithms merge the left and right views of two paired inputs to render 3D videos that can be watched without special glasses (autostereoscopy).
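For a rough sense of the numbers (illustrative geometry only, not BinaryVision internals): with a parallel two-lens rig, the on-screen disparity of a point grows with the lens baseline and shrinks with its distance. A small Python sketch, where the focal length in pixels is an assumed value:

```python
import math

BASELINE_M = 0.065   # ~6.5 cm lens spacing, as quoted above
FOCAL_PX = 1400      # assumed focal length of each lens, in pixels

for depth_m in (0.5, 2.0, 10.0, 50.0):
    disparity_px = FOCAL_PX * BASELINE_M / depth_m            # pixel offset between views
    parallax_deg = math.degrees(math.atan2(BASELINE_M, depth_m))
    print(f"subject at {depth_m:5.1f} m -> parallax {parallax_deg:5.2f} deg, "
          f"disparity {disparity_px:6.1f} px")
```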

Create 3D videos without a camera.



Watch 3D videos without special glasses.

BinaryVision software can also convert between conventional 3D video framepacks. It supports industry-standard display formats including side-by-side, half-width, anaglyph, and more.
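As a rough illustration of what one such conversion involves (a minimal OpenCV sketch under assumed file names, not the BinaryVision pipeline), the snippet below splits a full-width side-by-side frame into its left and right views and recombines them as a basic red-cyan anaglyph:

```python
import cv2
import numpy as np

def sbs_to_anaglyph(frame_sbs: np.ndarray) -> np.ndarray:
    """Split a full-width side-by-side frame and build a basic red-cyan anaglyph."""
    h, w, _ = frame_sbs.shape
    left, right = frame_sbs[:, : w // 2], frame_sbs[:, w // 2 :]
    anaglyph = right.copy()            # keep the right view's green and blue (cyan)
    anaglyph[:, :, 2] = left[:, :, 2]  # red channel from the left view (OpenCV is BGR)
    return anaglyph

cap = cv2.VideoCapture("trailer_sbs.mp4")   # placeholder side-by-side input
ok, frame = cap.read()
if ok:
    cv2.imwrite("frame_anaglyph.png", sbs_to_anaglyph(frame))
cap.release()
```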

Above are 3D scenes from "The Legend of Hercules" (2014). Below are 3D scenes from "Jurassic World" (2015). Displayed here for educational purposes only.





 



 







 
(Above are some 3D demo trailer clips from LG.
Displayed here for educational purposes only.)

Introducing 3rdEyeVideo

The first 3D viewing devices, or stereoscopes, were made in the 1800s by Sir Charles Wheatstone and David Brewster. The View-Master toy popularized stereo viewing in the 1950s and '60s. More recently, the Google Cardboard viewer stirred interest in Virtual Reality (VR) applications.

Stereographs (also known as wigglegrams, piku-piku, wiggle 3-D, wobble 3-D, and wiggle stereoscopy) are still images that simulate a 3D effect by displaying two views of a scene, one for the left eye and one for the right. A wigglegram alternates a single left/right image pair in a short animated loop:




 


 


BinaryVision extends 3D filmmaking with innovative GPU-accelerated 3rdEyeVideos (also known as wigglevids) based on brain wave frequencies. By intermixing input streams, its algorithms can render full motion videos rather than still images:
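The heart of a wigglevid is an output stream that keeps both views moving in time but alternates which eye's frame is shown. A minimal sketch of that interleaving (plain OpenCV; the file names and the two-frame hold are assumptions, and the brain-wave pacing described later is not modeled here):

```python
import cv2

def interleave(left_path: str, right_path: str, out_path: str, hold: int = 2) -> None:
    """Advance both views together, writing alternating blocks of `hold` frames per eye."""
    # assumes both clips exist and share the same resolution and frame rate
    left, right = cv2.VideoCapture(left_path), cv2.VideoCapture(right_path)
    fps = left.get(cv2.CAP_PROP_FPS) or 24.0
    size = (int(left.get(cv2.CAP_PROP_FRAME_WIDTH)),
            int(left.get(cv2.CAP_PROP_FRAME_HEIGHT)))
    out = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, size)
    use_left, written = True, 0
    while True:
        ok_l, frame_l = left.read()
        ok_r, frame_r = right.read()
        if not (ok_l and ok_r):
            break
        out.write(frame_l if use_left else frame_r)
        written += 1
        if written % hold == 0:       # switch eyes every `hold` frames
            use_left = not use_left
    for v in (left, right, out):
        v.release()

interleave("left_view.mp4", "right_view.mp4", "wigglevid.mp4")
```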



3D scenes from: "Avatar: The Way of Water," (2022).
Displayed here for educational purposes only.

"Stereographs on Steroids"


 

 










Imagine that....

A global social network holds the keys to a so-called immersive metaverse. Every user in that augmented world is required to put on a cramped virtual reality headset or awkward 3D glasses. Hour upon hour (day after day), in the classroom and in the workplace.

How long can you wear a head-mounted gadget on your face?

Now imagine why that won't happen

BinaryVision is based on brain wave autostereoscopy and doesn't require head-mounted viewing devices. Its full motion 3D videos (or 3rdEyeVideos) may be displayed on existing monitors, smartphones, TV sets, and moving picture screens. Unlike with 3D headsets, wigglevids are not experienced simultaneously by both eyes, but sequentially by the brain. It therefore takes a few moments of scrutiny before left eye and right eye visual information may converge into a real 3D view. Instead of a pop-out effect, 3rdEyeVideos sink in.

As an added benefit, users blind in one eye may also observe exciting stereoscopic video scenes because 3rdEyeVideo frames are interfused in a 3D sequence before the mind stores the information. BinaryVision brain wave frequency applications open new opportunities in telemetry science.














Michael Jackson - Thriller (This Is It 2009). Displayed here for educational purposes only.

 


Be Persistent

"If you find the 3D effect does not come at first, be persistent. Many people take a few minutes to discover it , but still find it perfectly vivid when they succeed." - John Frisby, Professor of Psychology, University of Sheffield (Illusion, Brain and Mind)

 

 


 

Along with lens focus, real 3D videos also have a stereopsis anchor point.

The 3D convergence area (stereopsis, or stereoscopic sweet spot) usually lies within the two lenses' overlapping field of focus. The lens-shaped overlap of the two views (a vesica piscis) marks the shared portion of binocular vision, and content inside this convergence region (the subject's face in this example) appears three-dimensional with little or no rocking effect or wobble.


YouTube 3D pop-out influencer: 3DN3D @1tompo1 (2012). Displayed here for educational purposes only.

 

If you alternately shut and open each eye, your sight will rock from left to right and back again. But with both eyes open, your brain directly aligns left and right image parallax within a precise 3D sweet spot. (The rest of your visual field shifts out of register but goes unnoticed and blurs into the background.) As your eyes wander, your brain continues to align and evenly merge the corresponding left and right images within your moving 3D region of interest (ROI).



 



Select a 3D sweet spot (or stereoscopic anchor point) and align the left and right views so that both eyes see the same object as one image.
The region of interest (ROI) sets the convergence difference in visual angles that creates the perception of depth.
Align the double images and remove the parallax 'ghost' image to decrease wobble (a sketch of this alignment follows below).
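A minimal sketch of that anchor-point alignment (OpenCV template matching along the matching scanlines; the ROI coordinates and file names are placeholders, and BinaryVision's own alignment may work differently):

```python
import cv2
import numpy as np

def zero_parallax_shift(left: np.ndarray, right: np.ndarray, roi) -> int:
    """Find the horizontal shift that brings the chosen ROI (sweet spot) into register."""
    x, y, w, h = roi                                   # anchor-point region in the left view
    patch = left[y : y + h, x : x + w]
    band = right[y : y + h, :]                         # search along the same scanlines
    scores = cv2.matchTemplate(band, patch, cv2.TM_CCOEFF_NORMED)
    _, _, _, best = cv2.minMaxLoc(scores)
    return x - best[0]                                 # pixels to shift the right view

left = cv2.imread("left_frame.png")
right = cv2.imread("right_frame.png")
shift = zero_parallax_shift(left, right, roi=(800, 400, 200, 200))  # e.g. the subject's face
aligned_right = np.roll(right, shift, axis=1)          # crude wrap-around shift for illustration
```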


 

BinaryVision software requires two video input sources or a 3D framepack to generate a 3rdEyeVideo:



(Left view and Right view, or a conventional 3D Framepack: Spatial, Side-by-Side, Top/Bottom, etc.)
 

 
The alternating left and right visual field (fusion frequency) of a 3rdEyeVideo is measured in cycles per second, or hertz (Hz):
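One way to read that figure: at a given frame rate, the fusion frequency fixes how many consecutive frames each eye's view is held before switching. A small sketch (the hold formula below is an assumed mapping, not a published one):

```python
def frames_per_eye(fps: float, fusion_hz: float) -> float:
    """One full left+right cycle lasts 1/fusion_hz seconds,
    so each eye is held for roughly fps / (2 * fusion_hz) frames."""
    return fps / (2.0 * fusion_hz)

for name, hz in [("theta", 6), ("alpha", 10), ("beta", 20)]:
    print(f"{name:>5} at {hz:2d} Hz, 60 fps video -> "
          f"hold each eye for ~{frames_per_eye(60, hz):.1f} frames")
```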
 


BinaryVision algorithms automatically simulate brain-wave frequencies to mimic a wide range of mind states.
To intensify or subdue a 3D experience, the wiggle or rocking effect can vary from hard to soft for each video scene:
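"Hard" versus "soft" is not spelled out here; one plausible reading, purely an assumption, is a hard cut between eye views versus a brief crossfade. A sketch of that interpretation:

```python
import numpy as np

def blend_views(left_frame, right_frame, phase: float, softness: float = 0.3):
    """phase in [0, 1) sweeps one wiggle cycle; softness 0 = hard cut, 1 = full crossfade."""
    hard = 1.0 if phase < 0.5 else 0.0                 # square wave: abrupt eye switch
    soft = 0.5 * (1.0 + np.cos(2 * np.pi * phase))     # cosine: gentle rocking
    w = (1.0 - softness) * hard + softness * soft      # weight given to the left view
    return (w * left_frame + (1.0 - w) * right_frame).astype(np.uint8)

# dummy frames just to exercise the function
left = np.zeros((4, 4, 3), dtype=np.uint8)
right = np.full((4, 4, 3), 255, dtype=np.uint8)
mid = blend_views(left, right, phase=0.25, softness=1.0)
```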

 
 



(Below are some random 3D movie trailer clips. Displayed for educational purposes only.)






Ultra fusion frequency

Adrenaline rhythm
Ready Player One (2018). Displayed here for educational purposes only.














 







3D scenes from 'Spider-Man: No Way Home' (2021)
Displayed here for educational purposes only.

 

 

 

Why wasn't the metaverse populated as intended?
Because most users couldn't create their own 3D content.
Existing stereoscopic video editing systems were too costly or far too difficult to master.
The metaverse diminished into a 3D brochure for big corporations with hardly any user content.




3D trailer from James Gunn's 'Superman' (2025)
Displayed here for educational purposes only.

 
 

 

Procedure #1.



Create 3D videos without camera lenses.
BinaryVision walks you through the simple steps of using AI video generators and making depth map framepacks (a rough sketch of the depth-map step follows below).
Watch 3D videos without special glasses.
Drop your framepacks into the 3rdEyeVideo Sequencer to watch without headset or 3D glasses.
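For readers curious what the depth-map step can look like, the sketch below warps a single AI-generated frame into a left/right pair by shifting pixels in proportion to depth (a naive depth-image-based rendering pass; the file names, the shift budget, and the "white = near" convention are all assumptions):

```python
import cv2
import numpy as np

def depth_to_pair(color: np.ndarray, depth: np.ndarray, max_shift: int = 12):
    """Shift columns by an amount proportional to depth (near pixels move the most)."""
    h, w = depth.shape
    shifts = (depth.astype(np.float32) / 255.0 * max_shift).astype(np.int32)
    xs = np.tile(np.arange(w), (h, 1))
    rows = np.repeat(np.arange(h), w)
    left = np.zeros_like(color)
    right = np.zeros_like(color)
    left[rows, np.clip(xs + shifts, 0, w - 1).ravel()] = color.reshape(-1, 3)
    right[rows, np.clip(xs - shifts, 0, w - 1).ravel()] = color.reshape(-1, 3)
    return left, right                   # occlusion holes are left unfilled in this sketch

color = cv2.imread("ai_frame.png")                       # placeholder AI-generated frame
depth = cv2.imread("ai_frame_depth.png", cv2.IMREAD_GRAYSCALE)
if color is not None and depth is not None:
    l, r = depth_to_pair(color, depth)
    cv2.imwrite("framepack_sbs.png", np.hstack([l, r]))  # simple side-by-side framepack
```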
 

Use authorized platforms to download 3D framepacks.
Frame packing refers to combining two frames, one for the left eye and one for the right eye, into a single packed frame. With the Side-by-Side 3D method, each sub-frame maintains full resolution; with the Top-and-Bottom 3D method, the sub-frames are stacked vertically. The Anaglyph format requires red-cyan glasses. 3D-enabled TVs, VR headsets, and home theater systems have grown in popularity.
 

3rdEyeVideo - Top 3D Movie Trailers  


 

 
Procedure #2.

Shoot your own 3D movies.
Budget action cams are now quite affordable, and even youngsters can create wigglevids with BinaryVision.

To render a 3rdEyeVideo from two inputs, simply trim your left and right start times (and optionally set a 3D anchor point):
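A hedged sketch of that trim step with OpenCV (the offsets and file names are placeholders; the trimmed captures then feed the same interleaving loop sketched earlier):

```python
import cv2

def open_trimmed(path: str, start_seconds: float) -> cv2.VideoCapture:
    """Open a clip and skip ahead so both eyes begin on the same moment."""
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 24.0
    cap.set(cv2.CAP_PROP_POS_FRAMES, int(round(start_seconds * fps)))
    return cap

left = open_trimmed("left_cam.mp4", 1.40)    # example: left camera started 1.4 s early
right = open_trimmed("right_cam.mp4", 0.00)
# ...then hand `left` and `right` to the interleaving loop shown earlier.
```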

 


 

BinaryVision is not only about making 3rdEyeVideos.


 

It's a feature-rich stereoscopic conversion engine with filters and effects.
It simplifies the task of making 3D video framepacks and optimized anaglyphs.

 

New: Convert Vision Pro Spatial videos to 3rdEyeVideos and watch 3D without the headset.

 


 

   3rdEyeVideo - Top 3D Movie Trailers   

 


The public launch of BinaryVision is coming soon...

 
3rdEyeVideo © 3D Brain Wave Algorithms copyrighted by Peter Fotis Kapnistos


© copyright 2025, all rights reserved

Coming Soon: 3rdEyeVideo algorithms will be made available as open source (filter_complex_script) for developers to include in their apps.
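Until that release, stock FFmpeg already gives a feel for what a filter_complex_script can express. The sketch below packs two views into a frame-sequential (alternating) stream with the standard framepack filter; it stands in for the 3rdEyeVideo algorithm rather than reproducing it, and the file names are placeholders:

```python
import subprocess

# Stand-in filtergraph using a stock FFmpeg filter: framepack=frameseq emits
# alternating left/right frames. The published 3rdEyeVideo script may differ.
FILTER_SCRIPT = "[0:v][1:v]framepack=frameseq[out]"

with open("3rdeyevideo.filter", "w") as f:
    f.write(FILTER_SCRIPT)

subprocess.run([
    "ffmpeg", "-i", "left_view.mp4", "-i", "right_view.mp4",
    "-filter_complex_script", "3rdeyevideo.filter",
    "-map", "[out]", "wigglevid_packed.mp4",
], check=True)
```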
 
 

3rdEyeVideo 3D Brain Wave Frequency Simulator:

Gamma rhythm - very fast scene movement; intense concentration; 30-100 Hz
Beta rhythm - fast scene movement; alert activity; 13-30 Hz
Alpha rhythm - normal scene movement; relaxed creativity; 8-13 Hz (default)
Theta rhythm - slow scene movement; daydream meditation; 4-8 Hz
..........
Hyperfocus - dynamic movement; adrenaline (ultra fusion frequency)
Delta rhythm - dark scene with edge detection; deep sleep; 1-4 Hz (not endorsed as 3D)

..........
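Read as a configuration table, the presets above map naturally onto a small lookup; the representative frequency picked for each band below is an assumption (the midpoint of the listed range), reused with the per-eye hold calculation sketched earlier:

```python
# Representative Hz per preset (midpoints of the ranges listed above are assumed).
RHYTHM_HZ = {
    "gamma": 65.0,   # 30-100 Hz, intense concentration
    "beta":  21.5,   # 13-30 Hz, alert activity
    "alpha": 10.5,   # 8-13 Hz, relaxed creativity (default)
    "theta":  6.0,   # 4-8 Hz, daydream meditation
    "delta":  2.5,   # 1-4 Hz, deep sleep (not endorsed as 3D)
}
# "Hyperfocus" lists no Hz range above, so it is omitted from the lookup.

def hold_frames(rhythm: str, fps: float = 60.0) -> float:
    """Frames to hold each eye per half-cycle at the preset's representative frequency."""
    return fps / (2.0 * RHYTHM_HZ[rhythm])

print(f"alpha preset at 60 fps: hold each eye for ~{hold_frames('alpha'):.1f} frames")
```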

Example:
TRANSITION (FADE IN THETA, CUT TO GAMMA)
Transitions are placed at the right margin of a screenplay page.