GameMonkey Script Forums

PostPosted: Thu Nov 02, 2006 10:07 am 
Joined: Mon Dec 15, 2003 1:38 pm
Posts: 708
I have approval to publicly release a paper (~5 MB PDF) that describes a novel approach to precomputed lighting. The research was done for a commercial program that needed highly accurate lighting but had to run on low-spec video cards, hence the pre-computation. The main features are: 1) statistically accurate global illumination, mainly for the non-specular component; 2) low memory usage; 3) scalable run time, with algorithms that are nearly independent of scene complexity and instead depend on surface area and sample resolution. The technique was inspired by photon mapping and shares some similarities with Instant Radiosity. Unlike most academic papers, this one is devoted almost entirely to implementation details, describing the problems encountered and their solutions in as non-technical terms as possible. Most importantly... it has pictures!

Edit: This is a direct PDF link as an experiment to help crawlers. Human users should prefer the above link or Downloads menu. I'll remove this later if it doesn't work.
http://www.gmscript.com/gamemonkey/down ... nique1.pdf


Last edited by Greg on Sat Feb 03, 2007 5:52 am, edited 1 time in total.

PostPosted: Thu Nov 02, 2006 2:54 pm 
Joined: Mon Jun 26, 2006 3:48 pm
Posts: 114
Location: Paris, France
Yeah, I shall have a look. Sounds good.

Thank you for this release, Greg.


PostPosted: Thu Nov 16, 2006 7:55 pm 
Joined: Mon Jun 26, 2006 3:48 pm
Posts: 114
Location: Paris, France
I'm not personally making any games or game-related stuff (I'm actually doing some research into compression algorithms in my free time, of course), but years ago I bought many books on game programming and found that in nearly all of them lighting was covered in only a few pages, probably because the authors didn't know good, fast lighting techniques.

And I must say that what I read was quite impressive. Did you work this out on your own?


PostPosted: Sat Nov 18, 2006 4:32 am 
Joined: Mon Dec 15, 2003 1:38 pm
Posts: 708
Unfortunately I did work largely alone on this specific problem, though I did have a few friends (and met one or two new ones) to run ideas past and talk tech with, which was great.

The project that the lighting solution was developed for was actually not a game (my primary passion). It had very specific hardware requirements, i.e. a very low-end video card, 64 MB of RAM and a 600 MHz CPU. The lighting had to be realistic, calculated as a background process, and previewable in minutes, even if higher quality could take much longer. It also had to require no human intervention and not increase the art workload.

The solution was an extension of my previous precomputed lighting research, most of which was done for video games around the Quake2 era. I had experimented with Radiosity and Ray Tracing as well as various hybrids and plain hacks. I did a lot of research to try and determine the right way, or best choice. During this time I read tons of research papers, took an interest in Photon Mapping and implemented some prototypes.

When the solution was done and working, meeting all the requirements, my boss suggested I write a paper on it. I did, and said that since my research was built upon the knowledge of others, perhaps extending it a little or using various algorithmic tools in a novel fashion, it was only fair to share this knowledge back with the community that freely gave it in the first place.

I did not think there was specific intellectual property worth keeping secret for financial gain, and even if there were, the information should be shared as soon as possible afterwards. I will refrain from ranting about software patents other than to say that 1) a heavily enforced patent system for software would have prevented computers and applications as we know them from existing, and 2) many individuals or small groups of people, given a specific problem to solve, will eventually arrive at the same or a similar solution, so why should the first prevent the next from doing so? Please don't respond to my opinions on this matter; if you agree, fight against software patents at every opportunity.

Modern games are increasingly performing lighting in real time and don't have archaic system requirements like that project did, so the overall technique is of diminishing value to game developers. I think coders trying to solve similar problems may still find some of my implementation details of interest, and I think there is definite potential for the core concept of automatic light placement, which can simulate ambient light, to be implemented as a plug-in for modelling apps like 3DStudio Max or Maya.
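
To make "automatic light placement" a little more concrete, here's a rough sketch of the Instant Radiosity-style idea the technique draws on: shoot photons from a light and turn the diffuse hit points into virtual point lights that later stand in for bounced/ambient light. This is illustrative only, not code from the paper or project; the stand-in scene (a single floor plane) and all the names are made up.

[code]
#include <cmath>
#include <random>
#include <vector>

struct Vec3 { float x, y, z; };

struct VirtualPointLight {
    Vec3 position;  // where the photon landed on a diffuse surface
    Vec3 normal;    // surface normal at the hit point
    Vec3 flux;      // RGB power this virtual light re-emits
};

// Stand-in scene: a single diffuse floor plane at y = 0 with 50% albedo.
// A real tool would trace against the whole scene using a BVH or BSP tree.
static bool TraceFirstDiffuseHit(const Vec3& origin, const Vec3& dir,
                                 Vec3* hitPos, Vec3* hitNormal, Vec3* albedo)
{
    if (dir.y >= -1e-6f) return false;   // ray never reaches the floor
    float t = -origin.y / dir.y;         // distance along the ray to y = 0
    *hitPos = { origin.x + t * dir.x, 0.0f, origin.z + t * dir.z };
    *hitNormal = { 0.0f, 1.0f, 0.0f };
    *albedo = { 0.5f, 0.5f, 0.5f };
    return true;
}

// Emit photons from a point light and turn their diffuse hit points into
// virtual point lights.  Lighting the scene from these VPLs then stands in
// for one bounce of indirect/ambient light.
std::vector<VirtualPointLight> PlaceLightsFromPhotons(const Vec3& lightPos,
                                                      const Vec3& lightPower,
                                                      int photonCount)
{
    std::vector<VirtualPointLight> vpls;
    std::mt19937 rng(1234);
    std::uniform_real_distribution<float> uni(-1.0f, 1.0f);

    for (int i = 0; i < photonCount; ++i) {
        // Uniform direction on the unit sphere via rejection sampling.
        Vec3 d;
        float len2;
        do {
            d = { uni(rng), uni(rng), uni(rng) };
            len2 = d.x * d.x + d.y * d.y + d.z * d.z;
        } while (len2 > 1.0f || len2 < 1e-8f);
        float inv = 1.0f / std::sqrt(len2);
        d = { d.x * inv, d.y * inv, d.z * inv };

        Vec3 pos, n, albedo;
        if (!TraceFirstDiffuseHit(lightPos, d, &pos, &n, &albedo))
            continue;  // photon left the scene

        // Each photon carries an equal share of the light's power; the hit
        // surface's albedo scales what gets re-emitted as bounce light.
        vpls.push_back({ pos, n,
                         { lightPower.x * albedo.x / photonCount,
                           lightPower.y * albedo.y / photonCount,
                           lightPower.z * albedo.z / photonCount } });
    }
    return vpls;
}
[/code]

A real tool would presumably also merge or thin these points so their number stays manageable for the run-time lighting pass.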

Another quick rant to bring up here: in my opinion, the future of real-time rendering is NOT ray tracing. Classic ray tracing is only part of the global illumination solution. Modern video cards with pixel shaders are almost doing ray tracing, perhaps without the tracing part ;) Pretty much all current realistic renderers (the kind you see making the latest movies like Shrek or Titanic) use a hybrid system: like ray tracing, but with photon maps doing a 'final gathering' phase for diffuse/ambient light and caustics. Unsurprisingly, ray tracing and radiosity type techniques in recent years have been accelerated by none other than rasterization and screen-space techniques.

People think that ray tracing will scale; eventually you could have one CPU, or one whole computer, processing a single pixel, all working in parallel. This is true, BUT the 'slow bit' of ray tracing is testing visibility. This is often as simple as testing whether one point in space can 'see' another point in space, and is used for tracing shadow feelers back to light sources to check for shadows (light influence), or for finding reflected surfaces. The time spent testing visibility is not constant or easily predictable, and a whole range of spatial subdivision and search methods have been developed to speed it up. Rasterization achieves the same thing by doing large amounts of work but largely limiting calculations to screen space and light space, with the aid of things like stencil and z-buffers.

So I'm basically saying that ray tracing has two major handicaps that prevent it from being the scalable solution of the future: 1) in itself, it's only part of global illumination; 2) it, and the hacky workarounds to (1), are both constrained by visibility determination. Rasterization, on the other hand, will scale better for the foreseeable future. Having said that, rasterization will often compromise accuracy for speed. For example, ray tracing a heavily reflective scene with many light sources can produce highly accurate results, whereas reflectivity in rasterization often involves rendering the scene into a cube map from specific points in space and then mapping that onto a nearby surface. What we're likely to see is ray-tracing hybrids continuing to be used for movie production, with hardware assistance, but not being used in games (which is obviously the context I care about).
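
To show what I mean by the 'slow bit', here's a bare-bones point-to-point visibility test, the kind of shadow feeler a ray tracer fires constantly. This is only an illustrative sketch (the standard Moller-Trumbore ray/triangle test over a flat triangle list, no acceleration structure), which is exactly the cost all those spatial subdivision schemes exist to avoid.

[code]
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3  Sub(Vec3 a, Vec3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float Dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  Cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
}

struct Triangle { Vec3 v0, v1, v2; };

// Moller-Trumbore ray/triangle intersection.  Returns true if the ray
// origin + t*dir hits the triangle with t in (eps, tMax).
static bool RayHitsTriangle(Vec3 origin, Vec3 dir, const Triangle& tri, float tMax)
{
    const float eps = 1e-5f;
    Vec3 e1 = Sub(tri.v1, tri.v0), e2 = Sub(tri.v2, tri.v0);
    Vec3 p  = Cross(dir, e2);
    float det = Dot(e1, p);
    if (std::fabs(det) < eps) return false;   // ray parallel to triangle plane
    float invDet = 1.0f / det;
    Vec3 tv = Sub(origin, tri.v0);
    float u = Dot(tv, p) * invDet;
    if (u < 0.0f || u > 1.0f) return false;
    Vec3 q = Cross(tv, e1);
    float v = Dot(dir, q) * invDet;
    if (v < 0.0f || u + v > 1.0f) return false;
    float t = Dot(e2, q) * invDet;
    return t > eps && t < tMax;
}

// Can point a 'see' point b?  Brute force: test the segment between them
// against every triangle in the scene.  This O(N) inner loop, run for every
// shadow feeler of every sample, is the work that kd-trees, BVHs and
// (in rasterization) shadow/stencil buffers all exist to avoid.
bool Visible(Vec3 a, Vec3 b, const std::vector<Triangle>& scene)
{
    Vec3 d = Sub(b, a);
    float dist = std::sqrt(Dot(d, d));
    Vec3 dir = { d.x / dist, d.y / dist, d.z / dist };
    for (const Triangle& tri : scene)
        if (RayHitsTriangle(a, dir, tri, dist - 1e-4f))
            return false;   // something blocks the segment
    return true;
}
[/code]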


PostPosted: Sun Nov 19, 2006 7:43 pm 
Joined: Thu Jan 01, 2004 4:31 pm
Posts: 307
You should submit this to writers@gamedev.net


PostPosted: Sat Apr 05, 2008 3:15 pm 
Joined: Sat Apr 05, 2008 3:02 pm
Posts: 1
Hi Greg.

I'm currently working for a games company and I'm trying to improve the look of our lightmaps. I stumbled upon this paper via Google and found it very informative and useful, as I've been scouring the web looking for techniques to address the solid-space bleeding problem and the texture-seams problem.

Since you wrote the paper, have you had any further insight into the seams problem? I've tried a couple of techniques myself yielding some improvements in certain cases but I've not found a general solution, if one even exists that is! :lol:

-Rob


PostPosted: Sun Apr 06, 2008 3:12 am 
Joined: Mon Dec 15, 2003 1:38 pm
Posts: 708
The solid-space problem is one of those feature-size things, where the size of texels relative to world space and the position of the sample points matter. The solution I used, and believe I described, is close to as good as it gets: attempt to detect polygon intersections, solid space and outside faces, and sample accordingly. To improve further would involve more polygon preprocessing, such as CSGing the scene, removing polygon garbage, or forcing the user to build their meshes so the result is a sealed 'skin' world. I've always tried to minimize restrictions on artists and designers, so I end up trying to infer topology from polygon soups and clean up the mess as best I can.
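
As a rough illustration of the 'sample accordingly' part (a sketch of the general idea, not the project's actual code; isInSolid stands in for whatever solid-space query the tool provides, e.g. a ray-crossing parity test):

[code]
#include <cmath>
#include <functional>

struct Vec3 { float x, y, z; };
static Vec3 Add(Vec3 a, Vec3 b)        { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
static Vec3 Scale(Vec3 a, float s)     { return { a.x * s, a.y * s, a.z * s }; }
static Vec3 Lerp(Vec3 a, Vec3 b, float t) { return Add(Scale(a, 1.0f - t), Scale(b, t)); }

// If a texel's world-space sample point falls inside solid geometry, try a
// few fallback positions: push out along the surface normal, then pull the
// point toward the centre of the polygon that owns the texel.  Returns true
// if a usable sample position was found.
bool FixupSamplePoint(Vec3& sample, Vec3 surfaceNormal, Vec3 polygonCentre,
                      float texelSize,
                      const std::function<bool(const Vec3&)>& isInSolid)
{
    if (!isInSolid(sample))
        return true;  // already in open space

    // 1. Nudge off the surface a little; fixes points embedded by precision error.
    Vec3 pushed = Add(sample, Scale(surfaceNormal, 0.25f * texelSize));
    if (!isInSolid(pushed)) { sample = pushed; return true; }

    // 2. Walk toward the polygon centre in a few steps; helps texels that
    //    straddle an intersection with another solid.
    for (float t = 0.25f; t <= 1.0f; t += 0.25f) {
        Vec3 candidate = Lerp(sample, polygonCentre, t);
        if (!isInSolid(candidate)) { sample = candidate; return true; }
    }
    return false;  // give up; the caller can copy a neighbouring texel instead
}
[/code]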

The texture seam problem I don't believe is solvable, despite claims I've read. In every case where I've read up on a claim of eliminating seams, the authors have either not actually done so, or have introduced new artifacts such as stretched texels, or texels that do match up, just not once bilinear filtering (and other real-world sampling techniques) is applied. I think some of the best results come from more complex UV unwrapping schemes. The hand-directed, or visibility-directed, splits are also interesting (that is, where a mesh is split, but the splits occur at the least visible locations, like the inside of a cow's leg if the model were that of a cow ;) ). You simply can't map a round ball onto a rectangular surface without some compromise. There are a bunch of free and commercial tools, some plugins for 3dsMax and Maya, others distributed by nVidia and ATI (AMD), which help solve the unwrapping problem. (I was just trying to find an interesting paper I read a few months ago which contained a novel unwrapping technique using voxel patterns or something, but I can't find it. Perhaps I'll post it later if I do.)
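
For what it's worth, the usual partial workaround (a mitigation, not a fix) is to dilate each UV chart's border texels into the unused gutter around it, so that bilinear filtering near a seam at least fetches plausible colours instead of background. It hides the worst of the bleeding but doesn't make the two sides of a seam agree. A rough sketch, assuming the lightmap is stored as RGB floats with a per-texel coverage mask:

[code]
#include <vector>

struct RGB { float r, g, b; };

// One dilation pass over a lightmap: every uncovered (gutter) texel that has
// at least one covered neighbour takes the average of its covered neighbours.
// Repeating this a few times pads each UV chart outward so bilinear filtering
// across chart borders samples sensible colours rather than background.
void DilateLightmap(std::vector<RGB>& texels, std::vector<char>& covered,
                    int width, int height)
{
    std::vector<RGB>  outTexels  = texels;
    std::vector<char> outCovered = covered;

    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            if (covered[y * width + x]) continue;   // real texel, leave it alone
            RGB sum = { 0, 0, 0 };
            int count = 0;
            for (int dy = -1; dy <= 1; ++dy) {
                for (int dx = -1; dx <= 1; ++dx) {
                    int nx = x + dx, ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= width || ny >= height) continue;
                    if (!covered[ny * width + nx]) continue;
                    const RGB& c = texels[ny * width + nx];
                    sum.r += c.r; sum.g += c.g; sum.b += c.b;
                    ++count;
                }
            }
            if (count > 0) {
                outTexels[y * width + x]  = { sum.r / count, sum.g / count, sum.b / count };
                outCovered[y * width + x] = 1;
            }
        }
    }
    texels  = outTexels;
    covered = outCovered;
}
[/code]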

..And welcome to the forum rjessop :)


PostPosted: Sun Nov 11, 2012 8:25 am 
Joined: Mon Dec 15, 2003 1:38 pm
Posts: 708
I did a little talk on this photon mapping technique some time ago.
PowerPoint slides are available here: PhotonMapping.pptx
Might suit those who like pictures and don't want to read much ;)

