Impossible.


Question: What do the following ideas have in common? Answer: They are all impossible, and they all intrigue me.
As a computer scientist I love all kinds of algorithms, but this site is dedicated to visualizing and understanding impossible phenomena through the use of innovative simulation and visualization algorithms.

Hypertime

Hypertime is a term I coined during a casual chat with Dr. Robert P. Burton in October 2003 to describe environments with multi-dimensional time. That hypertime is “impossible” was first demonstrated in a paper by Max Tegmark; that it can be visualized is demonstrated by two applications of my own design.

Hypertime is probably most easily understood by thinking about a movie player that has a time plane for the slider instead of a time line. From this view it becomes clear that we are not discussing some sort of ill-defined “alternate dimension” idea, but a mathematical replacement of the scalar time with a vector time. The challenge, then, is to generate and present the content of the hypertemporal movies.
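
To make the time-plane idea concrete, here is a toy sketch in Python (an illustration of the idea only, not code from either application): a hypertemporal movie is just a frame function of a two-component time vector, and the player's slider picks a point (t1, t2) in a plane rather than a point on a line.

    # Toy sketch: a movie with vector-valued time.  A scalar-time movie is a
    # function frame(t); a hypertemporal movie is a function frame(t1, t2).
    import math

    def frame(t1, t2):
        """One particle whose position depends on both components of time."""
        x = math.cos(t1) + 0.5 * math.cos(t2)
        y = math.sin(t1) + 0.5 * math.sin(t2)
        return (x, y)

    # The "slider" is now a point in a plane; any path through that plane
    # (a line, a loop, a scribble) plays back as an ordinary scalar-time movie.
    for step in range(5):
        t1, t2 = 0.1 * step, 0.25 * step   # one arbitrary path through the time plane
        print(frame(t1, t2))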

My first method for doing this is to simulate elastic collisions. An executable will be made available on this site soon. Because of the nature of the simulation, time may be navigated any way you wish, without worrying about such things as past and future. That this is an accurate simulation of elastic collisions, as well as the method of simulation, will be explained in my forthcoming master's thesis.
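
As a taste of why random access in time is even possible, here is a toy Python sketch (emphatically not the thesis algorithm) using a well-known trick: equal-mass elastic collisions in one dimension simply swap velocities, which is indistinguishable from letting the particles pass through one another and then relabeling them by position. The exact state at any time, past or future, then falls out directly.

    # Toy sketch, not the thesis method: exact state at ANY time t for
    # equal-mass 1-D elastic collisions, with no stepping from past to future.
    def state(positions, velocities, t):
        """positions must be given in left-to-right order."""
        ghost = [x + v * t for x, v in zip(positions, velocities)]  # pass-through paths
        return sorted(ghost)  # relabel by position: collisions preserve ordering

    x0 = [0.0, 1.0, 4.0]
    v0 = [1.0, 0.0, -2.0]
    for t in (3.0, -1.0, 0.5, 10.0):   # sample times in any order you wish
        print(t, state(x0, v0, t))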

My second hypertime simulation method is based on a simple artificial-intelligence model that gives simulated agents several position goals while keeping them from bumping into one another. Again, the executable will be made available soon, and the details will be given in my thesis.
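
For a flavor of what such an agent model can look like, here is a rough Python sketch of a common steering scheme (goal attraction plus neighbor repulsion). It is only an illustration; the actual model is the one described in the thesis.

    # Rough sketch of goal-seeking agents with collision avoidance;
    # not the actual thesis model.
    def step(agents, goals, dt=0.1, radius=1.0):
        """agents, goals: lists of (x, y) pairs.  Returns updated positions."""
        updated = []
        for i, (x, y) in enumerate(agents):
            gx, gy = goals[i]
            dx, dy = gx - x, gy - y                      # pull toward the goal
            for j, (ox, oy) in enumerate(agents):
                if i == j:
                    continue
                sx, sy = x - ox, y - oy
                d2 = sx * sx + sy * sy
                if 1e-9 < d2 < radius * radius:          # push away from close neighbors
                    dx += sx / d2
                    dy += sy / d2
            updated.append((x + dt * dx, y + dt * dy))
        return updated

    agents = [(0.0, 0.0), (5.0, 0.1)]
    goals  = [(5.0, 0.0), (0.0, 0.0)]                    # the two agents must swap places
    for _ in range(3):
        agents = step(agents, goals)
        print(agents)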

Curved Space

One of the most common “magical” effects encountered in fiction is the idea that a building with a small exterior can have a large interior. These effects can be explained by assuming the three dimensions we experience are the surface of a higher-dimensional manifold. To understand this, think of a 2D person wandering across a table top. Create a mushroom-shaped lump on the table and enclose the base of the lump with a wall: a small wall now encloses a large 2D area.
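
To put rough numbers on it: if the lump is nearly a complete sphere of radius R, attached to the table through a narrow neck of radius r, then the enclosing wall has length 2πr while the enclosed surface has area of roughly 4πR². Shrink r and grow R, and a tiny wall encloses an arbitrarily large region.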

However, the use of a manifold is constraining, as the resulting surface must fit continuously into some surrounding world of unspecified dimensionality. Instead, why not model exactly what is described: an area which is larger than other areas? With this model, we can map out the size of each point in space and work through what that implies.

Now, suppose you are walking along the boundary between a large area and a small area; say, with your left foot in the large area and your right in the small. For every yard on your right there are two yards on your left, so to walk in a “straight” line you have to move your left leg twice as far each second as you do your right. Of course, a photon wouldn't know to do that, so light would instead “curve” into the larger area. I put “straight” and “curve” in quotes because, from the perspective of a resident of such a space, the light would appear to be traveling in a straight line and you, pivoting hard to the right, would appear to be following a very curved path.
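
Here is a toy numerical version of that bending in Python (a sketch of the idea, not the math or code behind my animations; the size field sigma and its smooth-step shape are placeholders of my choosing). Treat the per-point size as a factor sigma(x, y). A path that is straight with respect to the local size obeys the same ray equation as light in a medium with refractive index sigma, and a ray launched along the seam between a big region and a small one drifts into the big one.

    # Toy sketch: light in a space whose per-point "size" is sigma(x, y).
    # A path that is straight with respect to the local size obeys the ray
    # equation d/ds(sigma * T) = grad(sigma), the same equation as light in a
    # medium with refractive index sigma.
    import math

    def sigma(x, y):
        """Smoothly about 2 for x < 0 (the big side) and about 1 for x > 0."""
        return 1.5 - 0.5 * math.tanh(4.0 * x)

    def grad_sigma(x, y, h=1e-4):
        return ((sigma(x + h, y) - sigma(x - h, y)) / (2 * h),
                (sigma(x, y + h) - sigma(x, y - h)) / (2 * h))

    def trace(x, y, tx, ty, ds=0.01, steps=400):
        """Euler-integrate the ray equation; returns the endpoint of the ray."""
        vx, vy = sigma(x, y) * tx, sigma(x, y) * ty   # v = sigma * unit tangent
        for _ in range(steps):
            gx, gy = grad_sigma(x, y)
            vx, vy = vx + ds * gx, vy + ds * gy
            speed = math.hypot(vx, vy)
            x, y = x + ds * vx / speed, y + ds * vy / speed
        return x, y

    # A ray that starts on the seam aimed straight along it (the +y direction)
    # ends up at negative x: it has curved into the larger region.
    print(trace(0.0, 0.0, 0.0, 1.0))

The steering comes entirely from the gradient of sigma, which is why a seam between a “two yards per yard” region and a “one yard per yard” region bends light toward the bigger side.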

I have worked out the math of what a straight path in such a world looks like and generated an animation of a camera zipping through a very sparse forest. The ground is colored to give a feel for the volume of the space through which the camera passes; red ground has roughly 1/3 the size of blue ground. Wonder what the path taken by the camera looks like from a flat-space perspective? Here it is with the camera moving from the yellow end to the cyan end over the course of the animation.

This brings us back to the opening thought that a hut can have a larger interior than exterior. What does it look like to have a house that is larger inside than out? How about smaller inside than out? Please pardon the graininess of these animations; they were made with an earlier and cruder version of the algorithm than the forest scene.

If you think it would be cool to have a real-time curved-space navigation program, you’re not alone, but so far I have only gotten the rendering process down to about 200 pixels per second per GHz of the underlying machine. Still, algorithmic challenges are my bread and butter, so I hope to improve that with future versions.

To see other work by Luther Tychonievich, try this site, devoted to writing just for the fun of it, and this site, a little demo of some of the output of my occasional coding just for the fun of it.

A cool picture (but big; nearly 1 MB) of an infinite Penrose-style pentagonal tiling taken with a spherical lens (implemented as an OpenGL vertex shader).