Ptex
By Neil Blevins
Created On: Apr 14th 2012
Updated On: Feb 15th 2021
This lesson will discuss the Ptex file format: what it is, its advantages and disadvantages, and why it has been slow to be adopted more broadly by 3d software.
A Ptex file is an image file format not unlike a tif, jpg or exr. The main difference is that it isn't a purely 2d format: it contains 2d pixel information on a face-by-face basis for a 3d model. The format was created at Disney for use in their proprietary paint3d software, but it has since been open sourced, and a number of other 3d applications have taken advantage of it.
For a more technical discussion of Ptex, including the original paper
and usage videos, visit Disney's
Ptex Site.
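To get a feel for the format, here's a minimal sketch using the open-source Ptex C++ library. It just opens a file and prints some basic metadata; the filename is an assumption for illustration, and exact signatures may vary slightly between library versions.

```cpp
#include <Ptexture.h>  // main header of the open-source Ptex library
#include <cstdio>

int main()
{
    Ptex::String error;
    // Open a Ptex file ("hand01.ptx" is a hypothetical filename).
    PtexTexture* tex = PtexTexture::open("hand01.ptx", error);
    if (!tex) {
        std::printf("Ptex error: %s\n", error.c_str());
        return 1;
    }
    // Unlike a tif or jpg, the data is organized per face of the mesh.
    std::printf("faces: %d, channels: %d\n",
                tex->numFaces(), tex->numChannels());
    tex->release();  // Ptex objects are reference counted
    return 0;
}
```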
2D vs Ptex Workflow
When dealing with normal 2d image formats, your 3d workflow would be
something like this:
- Model your 3d geometry
- Create UVs, which involves flattening the faces of your 3d
geometry into a 2d configuration
- Paint a 2d map that corresponds to your uv positions (either in a 2d paint app like Photoshop or a 3d paint app like Mudbox, Mari, Body Paint, etc).
- Wrap your 2d map around your 3d model using the uvs.
One of the main issues with this scheme is that it takes a long time to set up good uvs for a model, especially if it has a lot of separate pieces of geometry. And it's just so darn frustrating; it seems like there should be a way to paint on your model directly without the need for a 2d intermediate. That's where the Ptex format comes into play. Here's a common Ptex workflow...
- Model your 3d geometry
- Setup for Ptex (usually a button click or two)
- Paint on the geometry using a 3d paint program
- The paint applies directly to the 3d faces of your geometry
Advantages & Disadvantages
So there are a number of advantages to using Ptex over the normal 2d mapping method...
- No more uv setup, which means a lot of saved time
- No texture stretching. Setting up good uvs can be time consuming, and even the best uvs sometimes give neighboring faces very different texel densities, causing texture stretching to occur
- No artifacts at shell edges. When defining uvs, the edges between uv shells frequently lead to artifacts. Not so with Ptex.
- Each face can have a different resolution, so it's quite easy to add extra detail to a specific area of your model (see the sketch below). This is more difficult to set up with UVs.
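As a small illustration of that last point, the open-source Ptex library stores each face's resolution as a pair of log2 values, so resolutions are always powers of two and can differ freely from face to face. A sketch (the specific resolutions here are just examples):

```cpp
#include <Ptexture.h>

// Per-face resolutions in Ptex are stored as log2 values, so any face
// can be given more or fewer texels than its neighbors.
Ptex::Res lowRes(4, 4);   // 16 x 16 texels, e.g. for a rarely seen face
Ptex::Res highRes(9, 9);  // 512 x 512 texels, e.g. for a detail area
// lowRes.u() == 16, highRes.u() == 512
```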
There are some disadvantages though...
- Some things are just easier to paint in 2d. Say you have a rectangular cloth; it makes sense to apply a weave texture to your flattened 2d uvs as opposed to trying to paint it onto the final 3d representation of your object. Some software has ways of getting past this limitation, like flattening your 3d geometry so you can still paint in 2d onto Ptex.
- No photoshop support. Photoshop is the most common tool for
painting textures in the industry, but it does not allow you to paint
on Ptex files. As 3d paint programs like Mudbox, Mari, etc become more
feature complete, the need to use Photoshop to paint will probably
decrease.
- Changing the geometry of the model (like adding or deleting faces or edges) requires you to bake the Ptex file from the old geometry to your new geometry, with possible quality loss. This is a solvable problem (Mudbox, for example, can transfer Ptex from one model to another), but it is an extra step you have to worry about.
- Ptex files are generally associated with their corresponding
object by name, so Hand01.ptx is assigned to the 3d object in your
scene called Hand01. So if you rename your model, you also need to
rename its corresponding Ptex file.
- Again, since Ptex files are associated with their objects by name, if you have 200 objects that need an identical color map, it's easier to do that using uvs than to create and maintain 200 identical Ptex files, each named for its object (the hypothetical helper below shows what this convention looks like in practice).
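To make those last two points concrete, a pipeline following this convention might resolve Ptex paths with a helper like this one. The function and directory layout are assumptions for illustration, not part of the Ptex library:

```cpp
#include <string>

// Hypothetical helper: resolve an object's Ptex file from its name,
// reflecting the name-based association described above.
std::string ptexPathForObject(const std::string& objectName,
                              const std::string& textureDir)
{
    // e.g. ptexPathForObject("Hand01", "/show/textures")
    //      -> "/show/textures/Hand01.ptx"
    return textureDir + "/" + objectName + ".ptx";
}
```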
The Ptex Format
So a Ptex file is a list of faces, plus all the pixels (also called texels, for texture pixels) on each face. For example, here's a rock texture painted onto a 3d face...
This is what that texture map may look like applied to uvs in 2d...
And this is similar to what would be contained in a Ptex file...
As a 2d map, this is unreadable. But the 3d software reads the Ptex file, knows which chunk of texels is applied to which face, and the result in 3d is the proper paint on the proper faces.
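Here's a minimal sketch of that structure using the open-source library: walk the face list and read a texel from each face. It assumes an already opened PtexTexture (see the earlier example), and signatures may vary slightly by library version.

```cpp
#include <Ptexture.h>
#include <algorithm>
#include <cstdio>

// Walk the face list of an opened Ptex texture: each face carries its
// own resolution and its own grid of texels.
void dumpFaces(PtexTexture* tex)
{
    int nch = std::min(tex->numChannels(), 4);
    for (int f = 0; f < tex->numFaces(); ++f) {
        const Ptex::FaceInfo& info = tex->getFaceInfo(f);
        float texel[4] = {0, 0, 0, 0};
        // Read the texel at (0,0) of this face; values are returned as
        // floats regardless of the stored data type.
        tex->getPixel(f, 0, 0, texel, 0, nch);
        std::printf("face %d: %d x %d texels, first channel = %g\n",
                    f, info.res.u(), info.res.v(), texel[0]);
    }
}
```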
Painting And Baking
The two most common ways to use Ptex files in your workflow are as a
way to store 3d paint, and as a way to bake various properties into
your mesh.
- Baking: Say you want to bake an ambient occlusion pass into your mesh so that you can darken the cavities in your shader. Calculating the occlusion on a frame by frame basis may be slow, so instead you calculate the occlusion once and bake it into a Ptex file assigned to your object; now you have your occlusion without the need to set up uvs (see the sketch after this list). This is great if you want to apply baked occlusion to, say, 20,000 objects in your scene without having to set up 20,000 uv sets. If your mesh changes, or you need to add more meshes to your occlusion, you can always go back and rebake, which is usually a quick process. You can also bake the displacement of your mesh into a Ptex file.
- Painting: You use a 3d paint program like Mudbox or Mari and
manually paint onto your mesh. This paint is saved as Ptex files. Since
these are hand painted (not part of some procedural process), changing
your geometry is a bigger deal. You'll need to either repaint, or else
bake
your paint from your old mesh to your new mesh, and then repaint any
parts that have changed substantially.
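For the baking case, here's a minimal writing sketch with the open-source library. It writes a one-channel Ptex file for a quad mesh, filling every face with a placeholder value; a real baker would compute occlusion at each texel's surface position, and the per-face resolution chosen here is arbitrary.

```cpp
#include <Ptexture.h>
#include <algorithm>
#include <vector>

// Sketch: bake a single-channel map (e.g. occlusion) into a Ptex file
// for a quad mesh with numFaces faces. The constant fill value stands
// in for a real per-texel occlusion computation.
bool bakeOcclusion(const char* path, int numFaces)
{
    Ptex::String error;
    PtexWriter* w = PtexWriter::open(path, Ptex::mt_quad, Ptex::dt_uint8,
                                     1 /*channels*/, -1 /*no alpha*/,
                                     numFaces, error);
    if (!w) return false;

    Ptex::Res res(5, 5);  // 32 x 32 texels per face (arbitrary choice)
    std::vector<unsigned char> texels(res.u() * res.v());
    for (int f = 0; f < numFaces; ++f) {
        std::fill(texels.begin(), texels.end(), 255);  // placeholder
        w->writeFace(f, Ptex::FaceInfo(res), texels.data());
    }
    bool ok = w->close(error);
    w->release();
    return ok;
}
```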
Examples
The best way to get into Ptex is to jump in and start using it, so for those of you who own Mudbox 2012-2013, here's a tutorial on the Ptex workflow for Mudbox: Ptex Use In Mudbox 2013.
Here's some info for people who are interested in using ptex files
in vray for
3dsmax.
Ptex and Realtime Rendering On Videocards
So now that you know a little about Ptex, why haven't we
seen more of it? I mean, it was developed over a decade ago! Why hasn't
it taken the world by storm? To understand why, we need to look at a
little history.
History Of Pattern Placement
Original "computers" were specialized machines that had a
single pre-programmed purpose. The machine was created for a specific
task, and if you wanted to perform a different task, you needed a brand
new machine. Then we saw the birth of punch cards and the "personal
computer", where the hardware was
more generalized and you could write "software" to do many different
tasks
with the same hardware. You could in fact program a computer to perform
new tasks that weren't even thought of when the hardware was first
created. That started a period of great flexibility. Then started the
growth of the computer graphics industry, and the start of the desire
to apply textures to 3d models...
- Procedurals: The first major way of applying textures to a surface was using procedural patterns. Memory was at a premium, so procedurals made sense since they took up little memory.
- Bitmaps and UVs: As memory increased and techniques expanded, another popular texturing method appeared: a bitmap wrapped onto your surface using an atlas, or what we later referred to as UVs. Bitmaps allowed for a larger variety of patterns than you could easily achieve with procedurals; they were also frequently faster to render and easier to anti-alias. Nurbs surfaces and patches, whose UVs were locked to their topology, took over the film industry. But UVs for polygons (which were very popular in the video game industry) allowed the artist to make their own atlas. Film eventually moved in the same direction when subdivision surfaces replaced nurbs and patches.
- Projections: Then there were projections, the idea of taking bitmaps but, rather than tying them to an atlas, projecting them onto your surface from a projector source. This had the advantage of not requiring time-consuming UV setup. And while the most common use was matte painting, it was also used frequently to texture assets.
- 3D Paint: Around this era we also saw the first stabs at 3d paint programs, where you could paint directly on your 3d surface. The results of your 3d paint strokes would get baked to 2d UVs; saving the paint in a native 3d format of some sort never really caught on.
- Ptex: As time went on, and our 3d scenes started becoming more and more complex with bigger meshes and more objects, one thing was clear: setting up good UVs on objects was a very time consuming task, and not a lot of fun. That's when Disney brought Ptex into the mix in 2010. Now each face gets its own uv space. You either use a 3d paint program or bake a procedural texture onto your surface, and write the result to a Ptex file. The big advantage: no UVing necessary.
So now here we are at the present: we have procedurals, we have UVs, we have projections and we have Ptex. But for the most part, the most commonly used technique is still UVing objects and painting the bitmaps using either 2d or 3d paint techniques. Why is this technique still the most popular? Weren't people excited when UVless techniques started showing up? Don't UVless techniques help the artist achieve great results with less work? Why are we still using UVs then? While many causes could be pointed to, the strongest pull comes from the move towards Realtime Rendering, and the limitations that puts on available techniques.
Video Cards Favor The UV Workflow
The holy grail in the games industry
is fast frame rates. The faster the frame
rate, the smoother the visuals, the better the gaming experience. This
is even more important now
that we are entering the world of Virtual Reality and Augmented Reality
with the Oculus, HoloLens, etc, where high frame rates aren't only
desirable but are actually necessary or else people become
ill.
Realtime graphics is the realm of the video card. In 2009 we started seeing all this promise from "GPU rendering", and the first few GPU renderers could perform some rendering functions at incredible speed. A good example: when mental images' iray first became available for 3dsmax, it was super fast. But the speed came at a cost: you could only use a small portion of the standard 3dsmax features. Even today, with V-Ray RT and tons of other realtime renderers, we have the same problem; they're compatible with more features than they were in 2009, but still incompatible with a large number of them. This caused a lot of frustration, since people could no longer use the features they were used to.
While CPUs allow for a lot of flexibility, speed requires less flexibility and a stronger focus on single-purpose hardware. To achieve the highest speed on a video card, many base functions are a part of the hardware itself. The video card expects your 3d software to give it the data in the way the video card wants, and deviating from that means you don't get the fast frame rates.
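To see why, consider the data a video card is built around. A typical vertex buffer interleaves a uv pair with each vertex, and the hardware rasterizer interpolates those uvs so the texture units can fetch and filter texels along a fixed, heavily optimized path. A sketch of such a layout (the struct itself is illustrative, not tied to any particular graphics API):

```cpp
// A typical interleaved vertex layout for a GPU vertex buffer. The uv
// pair is part of the mesh data itself; the rasterizer interpolates it
// across each triangle, and the texture units use the result to fetch
// filtered texels entirely in hardware.
struct Vertex {
    float position[3];
    float normal[3];
    float uv[2];  // this is the piece the UV workflow feeds
};
// Ptex instead needs a face id plus a per-face texel-grid lookup (with
// filtering across face borders), which doesn't map onto that fixed path.
```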
Video card technology was for the most part driven by the needs of the videogame industry, as it was the manufacturers' biggest customer. Now
mobile devices have a huge
say with the hardware manufacturers, as well as potentially the VR and
AR field. But these markets have a lot more in common with the gaming
industry than with film when it comes to technique and performance
requirements. Since gaming and related fields are the largest market
for videocards, and UVing is the
most common way of texturing stuff in
videogames, video cards are created specifically to speed up that
particular workflow, to the detriment of other techniques.
Ptex On Video Cards
Part of the reason techniques like Ptex (despite their advantages over UVs) have had trouble gaining ground is that the segment of the industry currently most interested in seeing techniques like Ptex accelerated on video cards (i.e., the film market) is too small. The video card manufacturers aren't likely to improve the Ptex workflow on their hardware unless there's a lot of demand from their main customers (videogames, mobile). And the videogame industry overall hasn't been pushing the issue for a number of reasons...
- They have built so much of their pipelines and expertise around
UVs, a big change now would be difficult in terms of software, pipeline
and retraining.
- The old chicken and egg problem. Video cards aren't optimized to
allow for the Ptex workflow, so the video game folk don't even consider
it an option.
Since they haven't experienced the advantages, they don't ask for it.
And so the card manufacturers don't put it on
the video cards, hence video cards aren't optimized to allow for the
Ptex workflow.
- Lackluster authoring software support.
- Technical hurdles that would need to be addressed to see Ptex
used in
games (Quad workflows, Memory Overhead from Border Filtering, etc).
These technical hurdles are often given as the main reason for Ptex's slow adoption; however, the people I've spoken to feel these issues are all solvable if the will is there.
Here's an article from 2012 by Sebastian Sylvan called Casting a Critical Eye on GPU PTex that contains a lot of useful information: it lays out many of the technical issues that would need resolving to see Ptex work well on video cards, and the comments section has a good discussion with the Mudbox team (who made a hardware Ptex implementation) arguing that Sebastian didn't give Ptex a fair shake.
FX Guide posted an interesting article called UDIM UV Mapping in May of 2014, discussing the advantages and disadvantages of UVs and Ptex. The article was very pro-UVs, but two months later it was followed up by another article called Ptex, the other side of texturing, which got into more detail on the advantages of Ptex, and the advances Disney hopes to see in the area.
Of note, some research into making Ptex a viable option on hardware has indeed happened at the hardware manufacturers themselves. Here's an article from 2013 showing that Nvidia has actually made a Ptex implementation, at least at the R&D stage: Eliminating Texture Waste: Borderless Ptex. AMD as well: Radeon HD 7900 Series Graphics Real-Time Demos.
Asset Authoring "Realtime" vs
Game Engine Realtime
One other note: we should distinguish between Asset Authoring "Realtime" and Game Engine Realtime. These two areas have different requirements.
- Asset Authoring
realtime would be "I'm inside my 3d paint application painting a Ptex
file in realtime".
While this needs to update fast, it may not require 60 frames a second
to give the user the results they need. In fact, it may be better to
not call this "realtime" at all, but simply "interactive". Even getting
results back in 1-2 seconds may give an asset creation artist the
feedback they need in some circumstances.
- But for the final product, the videogame engine itself (which uses the assets), smooth and high frame rates are essential. Real realtime is necessary.
Perhaps UVless workflows including Ptex may work fine for the asset creation stage, just not for the game engine stage. The idea would be to use UVless techniques while you're making the asset, but then bake the result to textures assigned to Automatic UVs when you use the asset in the final product. Some sort of Automatic UVing exists in most asset authoring applications. You get UVs faster because you don't have to manually lay them out. But the disadvantage is that the UV layout tends to be messier, making it difficult to paint the results in a pure 2d paint program, and artifacts arise at the UV island borders, which are far more numerous and may not be placed in the ideal spots unless hand edited.
Here are some examples of using UVLess techniques at the Asset Creation
stage...
- Projections in Mari / Substance: The painting application Mari allows you to use projections (triplanar) and then bake the result to UVs. The same with Substance Designer / Painter, a newcomer on the scene for 3d painting in games. Since the results of the projections are baked to UVs, you can still use the results efficiently in a game engine on current videocards. But the disadvantage is that, because they're baked, if you want to change your projection you have to rebake, which may require manual work and computer time. And of course, if you create manual UVs to bake to, it takes a while to UV, and if you use Auto UVs, you get the disadvantages already outlined above. So this workflow removes some of the advantages projections offer.
- Ptex in Mudbox: Autodesk Mudbox allows you to display and paint Ptex in realtime on your model, but the way it does that is by creating a bitmap assigned to your mesh using a sort of automatic uv set that mimics how a Ptex file works, as opposed to directly displaying a Ptex file.
- Interactive Workflow, Bake At The Last Minute: You could also imagine a workflow where you work UVless for the entire asset creation stage and get interactive feedback (like using V-Ray RT inside 3dsmax or Maya, for example), and then your last stage before sending the model to the game engine is a texture baking stage, baking the textures to Automatic UVs (roughly as sketched below). This could potentially save the asset artist a ton of time (since they don't have to UV and can use things like projections to quickly apply complex library materials to an object), and still give you realtime results in your game engine. The only disadvantage here is that if you make an asset change, you have to wait for a rebake before you see the results in your game engine. And of course the inherent Auto UV issues remain.
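As a rough illustration of that bake-at-the-last-minute idea, here's a hypothetical sketch of the simplest possible automatic UV layout for per-face texel blocks: pack equal-sized tiles, one per face, into a grid atlas and hand each face the resulting uv rectangle. Real auto-UV systems are far more sophisticated; this only shows the shape of the step.

```cpp
#include <vector>

// Hypothetical sketch: lay out one equal-sized tile per face in a grid
// atlas and compute each face's automatic UV rectangle. Copying the
// face's texels (plus a filter border) into its tile is omitted.
struct FaceUVs { float u0, v0, u1, v1; };

std::vector<FaceUVs> packFaces(int numFaces, int tileRes, int atlasRes)
{
    int tilesPerRow = atlasRes / tileRes;
    std::vector<FaceUVs> uvs(numFaces);
    for (int f = 0; f < numFaces; ++f) {
        int tx = f % tilesPerRow;  // tile column for this face
        int ty = f / tilesPerRow;  // tile row for this face
        uvs[f] = { float(tx * tileRes) / atlasRes,
                   float(ty * tileRes) / atlasRes,
                   float((tx + 1) * tileRes) / atlasRes,
                   float((ty + 1) * tileRes) / atlasRes };
    }
    return uvs;
}
```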
Conclusion
As you can see, this is a really complex issue, with a lot of moving
parts that are controlled by a lot of different groups, from customers
to software companies to hardware companies to entire industries.
Trying to get all of these things to align is a really tough job and
takes a lot of time. Realtime and interactive rendering has major,
major advantages.
And UVless workflows really help the artist spend more time on the art
and less time on the technical. But right now these two things don't
work as well together as we'd like. My hope is that eventually we will be able to have our cake and eat it too. But to have
that, we need a push from all of the artists and technical folk in all
of the graphics
related industries. What we'd need to see...
- All industries (film, games, etc) ask the video card manufacturers
to make the necessary changes to promote the Ptex workflow (or an
equivalent), from the
hardware to the SDKs
- More research into resolving the technical hurdles that slow down
Ptex, or else we'll have to keep Ptex purely as an asset creation
technique
- Better integration of Ptex into 3d authoring software like
3dsmax, maya, mari, etc
- More focus on other UVless workflows in 3d authoring software
Realtime Rendering isn't going anywhere, and it has HUGE advantages. But it's sad that to have that advantage, we have to put up with the disadvantages of UVs. The only way around this issue is with your help. Can we have flexibility AND speed in the future? My hope is yes. But we all have to demand it together. There's no reason we can't have less focus on UV workflows AND realtime; we just need the industry to put their focus and dollars in that direction. And it all starts with you demanding it. So join me in carrying the torch, and keeping the dream alive.