This is a discussion with Ken Musgrave and David Kirk sometime last year. We were talking about what it takes
to make high-quality procedural textures in hardware.
> Actually, I think that bicubic magnification filters and 16-bit or floating point per-channel frame buffers would be a nice step towards this. You get the advantages of incremental computation & linearity, rather than an atomic operation like hardcoded Perlin noise. These features would allow a wide variety of multitexture effects. I don't know if this would be presented as a backend "pre-rendering" pipe, or if it would be the same as the main pipe.
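As an illustration of the bicubic magnification filtering mike is asking for, here is a minimal CPU sketch in C over a single-channel float image. Names like img and sample_bicubic are placeholders, and real hardware would apply this per channel in its filtering stage rather than in software; it is a sketch, not anyone's actual implementation.

    /* Catmull-Rom bicubic magnification over a single-channel float image. */
    #include <math.h>

    static float texel(const float *img, int w, int h, int x, int y)
    {
        if (x < 0) x = 0; if (x >= w) x = w - 1;   /* clamp at the borders */
        if (y < 0) y = 0; if (y >= h) y = h - 1;
        return img[y * w + x];
    }

    static void catmull_rom_weights(float f, float wgt[4])
    {
        /* Weights for the four taps at offsets -1, 0, +1, +2. */
        float f2 = f * f, f3 = f2 * f;
        wgt[0] = 0.5f * (-f3 + 2.0f * f2 - f);
        wgt[1] = 0.5f * ( 3.0f * f3 - 5.0f * f2 + 2.0f);
        wgt[2] = 0.5f * (-3.0f * f3 + 4.0f * f2 + f);
        wgt[3] = 0.5f * ( f3 - f2);
    }

    /* Sample the image at a continuous (u,v), in texel units, bicubically. */
    float sample_bicubic(const float *img, int w, int h, float u, float v)
    {
        int   iu = (int)floorf(u), iv = (int)floorf(v);
        float wu[4], wv[4], sum = 0.0f;
        catmull_rom_weights(u - (float)iu, wu);
        catmull_rom_weights(v - (float)iv, wv);
        for (int j = 0; j < 4; j++)
            for (int i = 0; i < 4; i++)
                sum += wu[i] * wv[j] * texel(img, w, h, iu - 1 + i, iv - 1 + j);
        return sum;
    }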
> Architecturally, current graphics hardware also has some performance problems with Paul Diefenbach's pipeline rendering methods (recursive rendering using textures -- see http://www.openworlds.com/employees/paul/index.html), or even animating texture content algorithmically per frame. (Texture uploads or using 3-D output as a texture map in the same frame.)
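The recursive render-to-texture path being described looks roughly like the following OpenGL 1.x sketch. draw_feedback_pass, draw_final_scene and the window dimensions are hypothetical application code; the per-frame glCopyTexSubImage2D copy from the frame buffer back into texture memory is the cost in question.

    /* Sketch of per-frame render-to-texture on OpenGL 1.1-class hardware. */
    #include <GL/gl.h>

    #define TEX_SIZE 256
    static GLuint feedback_tex;

    extern void draw_feedback_pass(GLuint tex);   /* hypothetical app callbacks */
    extern void draw_final_scene(GLuint tex);
    extern int  window_width, window_height;

    void init_feedback_texture(void)
    {
        glGenTextures(1, &feedback_tex);
        glBindTexture(GL_TEXTURE_2D, feedback_tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, TEX_SIZE, TEX_SIZE, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, NULL);  /* allocate, no data yet */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    }

    void render_frame(void)
    {
        /* Pass 1: draw the new texture content into a corner of the frame
         * buffer, then copy it back into the texture.  The copy is the
         * expensive part on this class of hardware. */
        glViewport(0, 0, TEX_SIZE, TEX_SIZE);
        draw_feedback_pass(feedback_tex);     /* may sample last frame's copy */
        glBindTexture(GL_TEXTURE_2D, feedback_tex);
        glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, TEX_SIZE, TEX_SIZE);

        /* Pass 2: draw the real scene using the freshly updated texture. */
        glViewport(0, 0, window_width, window_height);
        draw_final_scene(feedback_tex);
    }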
> Finally, and important for flexibility, downstream bandwidth is also a big deal for us. Being able to bitblt from video to system RAM lets us do all kinds of high-quality filtering and procedural textures that we can't do now. For final renders (where we have to evaluate procedural textures and use good filtering), we're finding it more efficient to render in software than to incur the overhead of reading from video memory.
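The video-to-system-RAM path mike wants to be fast is essentially the readback loop below; process_in_software is a placeholder for the CPU-side filtering, and on 1999-class hardware the glReadPixels transfer is what makes pure software rendering the faster option for final frames.

    /* Sketch of the readback path: pull the rendered frame into system RAM,
     * filter or composite it on the CPU, then push it back as a texture.
     * Assumes "tex" was already allocated at least w x h with glTexImage2D. */
    #include <GL/gl.h>
    #include <stdlib.h>

    extern void process_in_software(unsigned char *rgba, int w, int h);

    void readback_and_process(int w, int h, GLuint tex)
    {
        unsigned char *pixels = malloc((size_t)w * h * 4);
        if (!pixels)
            return;

        /* This transfer is the "downstream bandwidth" cost being discussed. */
        glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

        process_in_software(pixels, w, h);    /* high-quality CPU-side filtering */

        glBindTexture(GL_TEXTURE_2D, tex);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                        GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        free(pixels);
    }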
> mike

> ----- Original Message -----
> From: Ken Musgrave <musgrave@metacreations.com>
> To: David Kirk/NVidia/US <dk@nvidia.com>
> Sent: Thursday, July 29, 1999 4:31 PM
> Subject: Re: Future of real-time rendering
> > >Things are pretty good here - bringing 3D graphics to the mass markets is fun! Millions of Polygons for Millions of People!

> > Yes, working at MetaCreations is much the same--bringing graphics technology to the masses for cheap! Gotta love it; it really affects some people's lives (for the better sometimes, even...)

> > >People use "procedural texturing" as a catch-all, as if it means a single thing. I'm not really sure what you mean by it. If you could give me an equation of what you want, I might be able to offer something interesting.
> > The Perlin noise function is the best example. Though sparse convolution noise might be even better.

> > When I say "procedural textures," I mean textures like I present in our book "Textures and Modeling: A Procedural Approach."
> > >Where we are right now is putting upwards of 20 million transistors on a chip (that costs about $20 these days) and implementing an entire 3D graphics pipeline, in Infinite Reality ballpark of performance. Next generation will have nearly twice as many transistors, and will come along in about a year. We have enough transistors to do almost anything that people want, but have to prioritize.
> > Hey, that's great! It really must be exciting to be at the helm of bringing such power to the marketplace.

> > >If I take your request literally, you're asking for the equivalent of tens to hundreds of Pentium-III class instructions executed in a single cycle, to be used as an elemental building block. While we could build special purpose hardware that did just that, should we? You could accomplish almost the same thing by using multiple textures, and doing some LOD math.
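A rough sketch of the workaround David is suggesting: one band-limited noise texture applied several times at different scales and weights (one multitexture stage per octave), with the octave count chosen by LOD math so the detail stops at the pixel level. In CPU form, with sample_noise_texture standing in for a hypothetical bilinear texture lookup returning values in [-1, 1]:

    /* fBm approximated as a sum of rescaled lookups into one noise texture --
     * the "multiple textures plus LOD math" route. */
    #include <math.h>

    extern float sample_noise_texture(float u, float v);   /* bilinear lookup */

    float fbm_lookup(float u, float v, float pixels_per_texel)
    {
        /* LOD math: stop adding octaves once their features drop below a
         * pixel.  pixels_per_texel must be > 0. */
        int   octaves = (int)fmaxf(1.0f, log2f(pixels_per_texel));
        float sum = 0.0f, amp = 1.0f, freq = 1.0f, norm = 0.0f;

        for (int i = 0; i < octaves; i++) {
            sum  += amp * sample_noise_texture(u * freq, v * freq);
            norm += amp;
            amp  *= 0.5f;   /* each octave: half the amplitude ...            */
            freq *= 2.0f;   /* ... and twice the frequency (lacunarity of 2). */
        }
        return sum / norm;  /* back into roughly [-1, 1] */
    }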
> > Probably. But I've always been fascinated by the emergent behavior of pseudorandom texture functions of high periodicity.

> > What I intend to engineer, as the years pass, is a universe of galaxies with solar systems with planets with potentially unlimited detail, that is worthy of exploration due to the serendipity embodied in the procedural models. It's not an end in itself, but rather a context for content.

> > See the attached extended abstract.

> > I want to do in realtime (without the bad frames) what you see in the animation:

> > http://www.wizardnet.com/musgrave/gaea_zoom.mpeg

> > It's only a matter of engineering. ;-) (E.g., there will probably be floating pt. precision problems in creating such a universe.)

> > Once we have the technology, the artist in me can go nuts creating new worlds and "features" in them! Like "breeding" planets, using genetic programming...

> > -Mo
> > --------------------------------------------------------------------------
> > Ken Musgrave
> > MetaCreations
> > http://www.metacreations.com/people/musgrave
> > (805) 689-9222 (c)
> > (805) 684-6774 (h)
> > (805) 566-6331 (w)
> >
> > Arbitrary mayhem at the interstices.
> > --------------------------------------------------------------------------
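A closing aside on the floating-point precision problem Ken anticipates: a common mitigation (not something proposed in the thread) is to keep world positions in double precision or hierarchical frames and hand the renderer only camera-relative single-precision offsets, so the numbers stay small where the detail matters.

    /* One common mitigation for precision loss at planetary/galactic scales:
     * world positions live in doubles; the renderer only ever sees small
     * camera-relative floats. */
    typedef struct { double x, y, z; } dvec3;
    typedef struct { float  x, y, z; } fvec3;

    fvec3 to_camera_space(dvec3 world, dvec3 camera)
    {
        /* Subtract in double precision; the difference is small near the
         * viewer, so the cast to float loses little. */
        fvec3 out;
        out.x = (float)(world.x - camera.x);
        out.y = (float)(world.y - camera.y);
        out.z = (float)(world.z - camera.z);
        return out;
    }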