Forum FAQ Thread
Please also refer to the art and pipeline FAQ on our forums. If you still can't find what you need, try searching the forums for the issue. If you still can't find the answer, post a new thread about it!
Why is my vertex painted character or asset rendering as if painted black?
There are two potential issues that can cause this:
- Exporter Settings
- Vertex Painting (Max)
One primary reason for this is that the exporter must have exactly the right settings to function. These settings are established each time you export by the export script. This script first loads a .GES file with the bulk of the settings appropriate to the type of export, then it adjusts additional settings based on the circumstances.
If the export script cannot find the .GES files, then the appropriate settings will not be set. The location of these .GES files is defined in the grannyu.ms script and must be adjusted for your particular installation. When the grannyu.ms file is updated, it is easy to forget to readjust this path.
Exporter version 2.97 and above will not let you export if this path is not set correctly.
If you are still on an older exporter, then you must change the path in the grannyu.ms script, located in the appropriate maxscripts folder within the GrannyExportSettings folder. The path must point to where the .GES settings files are located, which is generally the root of the GrannyExportSettings folder.
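As a quick sanity check before exporting, you can verify the settings path the way the exporter effectively does. The sketch below is a hypothetical helper (not part of the exporter; the function name and messages are made up) showing the two failure modes described above: the folder is missing, or it contains no .GES files.

```python
import os

def check_ges_path(settings_dir):
    """Return a list of problems found with an export-settings folder.

    Hypothetical helper illustrating the checks described above: the
    folder named in grannyu.ms must exist and must contain the .GES
    preset files, or the export settings will not be applied.
    """
    problems = []
    if not os.path.isdir(settings_dir):
        problems.append("settings folder does not exist: %s" % settings_dir)
    else:
        ges_files = [f for f in os.listdir(settings_dir)
                     if f.lower().endswith(".ges")]
        if not ges_files:
            problems.append("no .GES files found in %s" % settings_dir)
    return problems
```

An empty result means the folder looks usable; otherwise each string describes what to fix.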
Vertex Painting (Max)
The second way geometry can end up black is if it was vertex painted black. This is a common problem because Max can easily lose track of vertex paint color channel information, leaving the model all 'black'. Sometimes a Max model can get corrupted when moving between versions or through other formats such as .fbx. This is hard to spot, since Max doesn't make it easy to view the vertex paint. To test whether this is the problem, try vertex painting the model white and re-exporting.
If all else fails try resetting the XForm.
What causes the "ConvertGrannyAnim failed" error?
This particular error is caused by bones in the skeleton having a non-uniform scale.
This can be corrected by using the 3ds Max Bone Edit Tools to make a clean skeleton.
What should we do if we accidentally exported meshes as static assets and not morph meshes?
Because the meshes were exported as static assets, your tangent merging tolerance was probably defaulted to 75, allowing some unseen topology variation (whether vertex count or simply vertex ordering) to occur on export. The morph mesh export option is designed to eliminate any potential for variation such as this. Discrepancies in the meshes resulting from that tolerance setting would explain why a log file might show "0 file(s) copied" following the OBJ>TRI process, or "Unable to load master TRI file" errors. Using the morph mesh export option should solve this problem for you.
If you were working with (for example) 6 different control meshes, your log file should display a "6 file(s) copied" message at the end of the OBJ>TRI process, followed by a message that states "5 diff morphs converted to EGM modes" after it converts TRIs to EGMs (one less than what you inputted because your 'base' mesh becomes the 'InternalMeanFace' in Customizer jargon, and everything else is considered a diff morph variation of that). When these numbers are what you expect them to be, it's a good indication things have gone smoothly.
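Those two log counts can be checked mechanically. The sketch below is a hypothetical helper (the function name is made up; the log phrases are the ones quoted above) that confirms N control meshes produced "N file(s) copied" and "N-1 diff morphs converted":

```python
import re

def check_morph_log(log_text, control_mesh_count):
    """Sanity-check an OBJ>TRI / TRI>EGM log (illustrative sketch).

    With N control meshes you expect "N file(s) copied" and
    "N-1 diff morphs converted" (the base mesh becomes the
    InternalMeanFace, so it is not counted as a diff morph).
    """
    copied = re.search(r"(\d+) file\(s\) copied", log_text)
    morphs = re.search(r"(\d+) diff morphs converted", log_text)
    return (copied is not None
            and int(copied.group(1)) == control_mesh_count
            and morphs is not None
            and int(morphs.group(1)) == control_mesh_count - 1)
```

For the six-control-mesh example above, a log containing "6 file(s) copied" and "5 diff morphs converted ..." passes; anything else warrants a closer look at the export.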
By the way, you shouldn't need to export to both HGM and OBJ format directly from Max. For the purposes of setting up your facegen control meshes (those meshes you'll use to create your morph extremes), we recommend exporting to HGM format using the morph mesh export option, then converting those files to OBJs. Place these OBJ files in your \HE\FaceGen\male_body\egm\objs folder, edit your add_morphs.bat file accordingly and then run process.bat.
For the control meshes, you can export directly to OBJ format, but we've had better luck keeping scale and orientation correct among the components by exporting via the HeroEngine Export tools first, then converting. This may seem like an extra step, but all of your character parts that will eventually need to run through geometry integration will also be converted to OBJ files in the same way, for the purposes of generating corresponding EGMs, so in the interest of remaining consistent, we prefer that approach.
If there are still errors after trying the above, it is possible that your template directory may be out of date. There should be a pre-existing egm\tris folder. CustomCtl.exe, dummy.egt, and makeCtl.bat should all be found in the ctl folder before you begin, and the si.ctl file should be generated when you run process.bat. If you are unsure whether you have all of the needed components, contact the HeroEngine team about getting an up-to-date template directory.
Why am I unable to import textures?
This error generally indicates an art server setup problem (see Art Server Setup): the engine was unable to find the texture based on the path embedded in the exported asset.
Textures in HeroEngine are not embedded in their assets as in some engines; instead, the asset specifies the filepath at which the texture can be located in the repository. By doing it this way, it is easy to create textures and texture libraries that are shared by many assets, greatly reducing the number of textures the engine needs to load to render an area.
If you are using an Evaluation World, then the path should include */HJ/*. The reason for this is that the Evaluation World leverages artwork from our own game, Hero's Journey, and consequently is configured to have the preprocessor look for */HJ/* in the path when stripping the drive from the path.
Alternatively, the following example should help.
Let's say, for example, you want to export a box with the texture demo.dds.
First and foremost, always export and point your textures through a mapped drive letter. This avoids a lot of potential problems right away.
Second, if your destination path on the server is hjdev/world/export/demo.dds, then your local or network path should mirror it after the mapped drive letter (for example, X:\hjdev\world\export\demo.dds); the portion after the drive letter is the part that matters most.
The drive letter doesn't matter; it can be mapped as anything.
As a reminder, export through a letter drive only or the export will fail.
Also remember that the path is case sensitive.
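The rules above can be sketched in code. The helper below is an illustration, not engine source: it assumes the preprocessor behavior described above (strip everything through a case-sensitive */HJ/* marker to get the repository-relative path), and the function name and error message are made up.

```python
def repository_path(local_path, marker="/HJ/"):
    """Derive a repository-relative texture path from a mapped-drive path.

    Illustrative sketch of the preprocessor behavior described above
    (the marker and exact rules are assumptions, not engine source):
    the drive letter is irrelevant, and everything up to and including
    the case-sensitive /HJ/ marker is stripped.
    """
    normalized = local_path.replace("\\", "/")
    idx = normalized.find(marker)  # case-sensitive, per the note above
    if idx < 0:
        raise ValueError("path does not contain %r: %s" % (marker, local_path))
    return normalized[idx + len(marker):]
```

For example, X:/HJ/world/export/demo.dds and Q:\HJ\world\export\demo.dds both resolve to the same repository path, while a path spelled /hj/ (wrong case) fails, matching the case-sensitivity warning above.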
- See also: Art Server Setup
Why is dynamic sky unaffected by fog?
The issue is related to fog alpha. In essence, when fog alpha is 0, the sky horizon color completely overrides the fog color; at alpha = 1, the fog color completely overrides the sky horizon color. Most of the time you want something in between, which results in a blend.
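The behavior described is a straightforward linear blend. The sketch below illustrates that relationship (the engine's exact formula is not documented here, so treat this as an assumption): alpha 0 yields pure sky horizon color, alpha 1 yields pure fog color, and values in between interpolate.

```python
def blended_horizon(sky_color, fog_color, fog_alpha):
    """Linear blend between sky horizon color and fog color.

    Illustrative sketch of the fog-alpha behavior described above;
    colors are (r, g, b) tuples of floats in [0, 1].
    """
    return tuple(s * (1.0 - fog_alpha) + f * fog_alpha
                 for s, f in zip(sky_color, fog_color))
```

So with fog alpha at 0.5, the horizon you see is halfway between the two colors, which is why the dynamic sky appears "unaffected" only at the extreme alpha values.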
Why does the character manager get stuck/grayed out when I try to load my character?
The issue is that the character manager UI is waiting for the character to be "ready", and that is not happening because some of your parts use materials that reference (dynamic) textures which do not exist in your repository.
What is a material redefinition error?
Material [new_quilmore_house_01] REDEFINITION on [\world\livecontent\resources\quillmore\quillmore_dummy_house_04], first defined in [\world\livecontent\resources\quillmore\quillmore_dummy_house_01] Reason: PolyClass COLLIDE != WALK
When more than one 3DS Max file uses the same name for a material, the material has to be identical in every setting: the textures used, the checkboxes and radioboxes, etc. When HeroEngine loads mesh files it registers the materials it sees by the name given. If the name is already there, then it checks if the settings of the existing material of that name are identical to the one being loaded... if they are not, you get a material redefinition error (and it tells you what setting was different).
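The registration-and-compare behavior described above can be sketched as follows. This is an illustration of the described check, not engine source; the class, method names, and the settings dictionary are made up, and the error string only loosely mirrors the quoted message format.

```python
class MaterialRegistry:
    """Sketch of the load-time check described above (not engine source).

    Materials are registered by name; a later definition with the same
    name must match the first in every setting, otherwise a
    redefinition error is reported naming the differing setting.
    """
    def __init__(self):
        self.materials = {}  # name -> (settings dict, first source file)

    def register(self, name, settings, source_file):
        if name not in self.materials:
            self.materials[name] = (settings, source_file)
            return []
        first_settings, first_file = self.materials[name]
        errors = []
        for key in sorted(set(first_settings) | set(settings)):
            if first_settings.get(key) != settings.get(key):
                errors.append(
                    "Material [%s] REDEFINITION on [%s], first defined in "
                    "[%s] Reason: %s %r != %r"
                    % (name, source_file, first_file, key,
                       first_settings.get(key), settings.get(key)))
        return errors
```

Registering the same name with identical settings is silent; any differing setting produces one error naming it, which mirrors why the quoted error calls out "PolyClass COLLIDE != WALK".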
Keeping materials consistent between 3DS Max files is a major pain in the butt. Autodesk has not provided any mechanism that makes that easy to keep straight. Some game engines reverse things by having materials defined in the game engine and then reflected back to the art tool instead of the way HeroEngine works. And, in fact, I believe we are moving in that direction specifically because of this issue.
How do I create my own Shader effects in HeroEngine?
(As of May 2009) With the caveat that anything forward-looking is subject to change, our major effort this year on the rendering side is a total refactoring to support a variety of features, including easier editing of shaders.
The current system does not directly support adding new shaders. That being said, I believe new ones can be added by modifying the Max HeroMaterial script and adding new entries to the dropdown box. I have to check whether the engine uses that directly or whether there is some translation step in the engine that would have to be modified. If the latter is true, then you'd have to fork the source code to add a new one.
In either case, if the shader needs to do anything that takes different inputs than the engine currently supplies, then you'd have to fork the code as well.
The refactoring will solve this problem, and provide many other enhancements.
Essentially the plan is to refactor the entire rendering system. This is being done in stages, and that work is already underway. We hope that each stage will incrementally provide benefit (or lay groundwork for it). For example, the texture system is being refactored right now and supports many new features such as texture sources (which can be plugins, to support things like Pro FX, etc.) and advanced video memory management and downsampled preload and proxy texture support.
Part of our design goal is to refactor each part as a plug-in. If successful, the end result is that the rendering system will become a "constellation of plugins" which provide opportunities to change out different aspects to suit specialized needs without forking the source code. This latter goal is ambitious, but goes a long way towards future-proofing the engine. And, at the same time, lets us test individual sub-components external from HeroEngine and provide alternate implementations.
We have already refactored the visibility system into a plugin so that it can use either dPVS or Umbra, and can support any other approach that conforms to the interface.
To support custom shaders, there will be a major refactoring of both the material system and the shader system. These are both in the design phase right now, but the goal is that they will be data-driven so as to be as flexible as reasonable for introducing new shaders. A graph-based editor capability is part of the plan. However, one must keep in mind that with any such system, there are many pitfalls... they are not the nirvana many make them out to be. For example, people using Unreal 3's shader editor often end up creating unmaintainable, poorly performing messes that have to be straightened out by hand-coding shaders with the help of experts.
Nevertheless, the approach does have its merits, and we want to make it possible to add shaders to the engine to customize the effects and looks you get. We are unlikely to create our own node-based editor, at least at first. We are looking at solutions like mental mill, which may be more appropriate since it can be customized and is very powerful right out of the box; this would get us to the finish line a lot faster.
The general order in which the refactoring will take place is:
- Visibility (done)
- Texture Manager (done)
- Render Scene Graph (actively working on this now)
- Material System (in design)
- Shader System (pending design)
- etc.
In parallel with these we are refactoring the low level graphics driver to be platform agnostic. Currently HeroEngine's client talks directly to DirectX9. We are implementing a graphics driver layer that abstracts this out so that the client doesn't talk to any specific graphics API. Instead this is supplied by an appropriate driver plugin. Ultimately the goal is to enable plugins for DX10, DX11, and eventually platforms like Mac, XBOX 360, PS3, etc.
Not mentioned here is a refactoring of the animation and other systems into plugins. The exact scheduling is not determined on these yet, however.
We hope to release these refactorings in stages, although I can't be specific on timing. I hope this gives you a general idea of the direction we are heading.
Why does the light from my lights go through walls?
The behavior you experience is the expected behavior. Dynamic lights (omni/spot) do not cast shadows nor are they blocked by geometry.
If you need to ensure they are restricted to a single physical room (i.e. the kitchen), you can add that geometry to its own "Room" (see Rooms) and set the light's property "Restrict to Room" to true. Optionally, you can adjust the range of the light so that it does not extend beyond the "walls" of a particular volume.
As a general note, lights are relatively expensive, so it is best to use them judiciously where they make a difference and utilize "area" lighting to supply the majority of the lighting for your areas. Additionally, you can often use particle effects to achieve the look and feel of a light while rendering significantly faster than an actual light.
Can I programmatically control the camera?
Cameras are fully under script control, allowing you to implement a top down camera with the behaviors desired for your game. The _ExternalFunctions script on the client contains the external functions for working with cameras.
Documentation can be found on the wiki at: Camera
Why does my extremely high poly 3d asset not render correctly?
This is not an engine issue with HeroEngine; rather, the issue is that you are overflowing the index buffer in Direct3D.
Programs such as Max get around this by allocating exceedingly large index buffers, allowing you to create impractically large objects with poly counts that are unworkable from a game-engine perspective. Large index buffers are prohibitively expensive (processing/memory), and game rendering engines do not normally support more than about 60k polys for any individual object (generally, a good 3D modeler can get characters to fewer than 10k polygons, and most objects need less than that).
Additionally, the rendering of a single object with, say, 190,000 polygons has different performance characteristics than rendering 19-190 objects with 1000-10000 polygons (a much more realistic scenario for an MMO). DrawCalls and SetTextureMaps are of course much more expensive than the rendering of an individual poly.
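To make the limit concrete: Direct3D index buffers commonly use 16-bit indices, which can address at most 2^16 = 65,536 vertices in a single draw. That is an assumption about the buffer format (HeroEngine's exact format is not documented here), but it illustrates why counts around 60k are a practical ceiling while a 190,000-poly object fails.

```python
MAX_16BIT_INDICES = 2 ** 16  # 65,536 vertices addressable with 16-bit indices

def fits_in_16bit_index_buffer(vertex_count):
    """True if a mesh's vertices fit within a 16-bit index buffer.

    Illustrative sketch: a single object whose vertex count exceeds
    this limit overflows the index buffer and cannot be drawn in one
    call with 16-bit indices.
    """
    return vertex_count <= MAX_16BIT_INDICES
```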
How do I create custom shapes for the Physics panel?
- Main page: Collision representation
The physics representation of an object is automatically generated by the engine without any need for your artists to do anything. This automatic generation is done based on a specification, and it is that specification that you can modify using the Physics Panel. One of the significant benefits of creating physics representations by specification is that when your artists iterate on a model, the physics representation is automatically re-created using the specification you defined, so it is always up to date.
In the case of a custom shape specification, you use the Decomposition Depth Slider to adjust how far you want the object to be decomposed to generate the physics representation. By enabling the "Visualize Physics Data" checkbox located at the top of the Physics Panel, you can view the physics representations of the objects in your game levels. Using the slider and the Test Build button, you can see how the various decomposition depths affect the representation. When you are happy with the result, use the "Build and Publish" button to notify the physics server to use the physics specification you created for that object, automatically updating all instances of that particular asset to use the new representation.
See also: Physics tutorials
How do I make an FX effect originate from a specific point on a weapon?
There are two potential solutions: one involving primarily art resources, and one primarily programming resources.
- Create a Helper Object
- Create a System that allows you to store custom offsets for objects
Both solutions require the implementation of a custom FX task to handle this new custom capability for the FX system.
Create a Helper Object
The art team can create simple helper objects that are invisible (no texture) and do not collide. Each helper object is named using a naming convention so you can retrieve it in script. A helper object is exported, and you use that object to determine the starting point for your FX. Using (most likely) a prop bucket, you would instantiate the helper object(s) necessary to support the functionality of a particular weapon when you "equip" the weapon. For example, an invisible triangular helper object might be exported with its origin set such that the middle of the bounding box that would encompass the triangle is at exactly the point you want to use as the "fireballs come from here" point.
Using the external function GetNodeBoundingBox(), you can find the corners of the bounding box and take the midpoint to get the point in space to use as the source position. The helper object, of course, needs to be positioned coincident with your sword, with its origin adjusted to put the point of the triangle in the precise spot.
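The midpoint computation is trivial but worth spelling out. The sketch below is in Python rather than HSL, purely for illustration; it assumes the two opposite corners returned by something like GetNodeBoundingBox() and averages them componentwise to find the emission point.

```python
def bounding_box_center(box_min, box_max):
    """Midpoint of an axis-aligned bounding box.

    Illustrative sketch of the computation described above: average
    the two opposite corners (x, y, z tuples) to find the point to
    use as the FX source position.
    """
    return tuple((lo + hi) / 2.0 for lo, hi in zip(box_min, box_max))
```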
Create a System
An alternative to helper objects is the creation of a system that is capable of creating and editing offsets for specific assets. The information could be stored as part of your item specifications or in a totally separate data structure. Ultimately, it would of course be nice to have an interface (perhaps using virtual stages in GUIs) that would allow your designers to easily specify these types of offsets for objects in your world.
Extending the FX System
Either of the solutions will require the implementation of a game-specific FX task that knows how to use the information, whether that is a helper object created by artists or data stored in a system created by your programmers. The actual task would function in many ways like the attacher tasks, but knows how to determine the offset required based on the "weapon" used.
You can find examples of custom FX Tasks in the Emitter and WeaponTrails in Hero's Journey's game specific tasks in the Reference world.
Is it more efficient to render one object with many polygons, or many objects with few polygons?
- See also: General Counts and Budgets
The single object will be rendered more efficiently. Additionally, you will often find that a single object requires fewer polygons for the same effect as multiple objects, because you can get away with not having polygons on the backs/bottoms of things. The downside, of course, is that you give up some flexibility when you always make single objects for a scene.
For example, think about a classic alchemist's shop in any MMO: there are tables, bookcases, jars, bottles, etc. You could do the entire interior of the shop as a single asset, but then you have given up the ability to rearrange the room in any way. Going to the other extreme, you might model every single bottle separately, operating under the theory that you gain maximum flexibility that way. We generally feel that somewhere between those two extremes is probably the sweet spot. So while we wouldn't recommend making each test tube a separate object, perhaps a bookcase with all of its contents makes sense. This allows your level designers to take advantage of the flexibility of which HeroEngine is capable and rearrange your StereoTypical Alchemist Shop (tm) to be slightly different, while remaining generally efficient to render.
Ultimately, it's probably up to someone like an Art Director and/or Lead Technical Artist to decide where the line is for your game.
How do I attach a particle effect to a character/object?
If you want to attach a particle effect to a character or object, you have four major options:
- Character Behave Commands (character only): use HOLD (see Behave commands#HOLD) to have the character animate as if it were holding the particle relative to a bone
- BoneTracker Nodes: BoneTrackerNode
- HSL script: You can write a script to get a per-frame callback and reposition the particle relative to the character (less efficient than the first two, but allows for behaviors not possible using them)
- FX System: The fx system knows how to attach particles it creates using a combination of the above three methods
When I load an asset, why does it show different textures on everyone's computers?
The problem results from several different materials sharing the same name.
When HeroEngine loads a material, the properties for that material are loaded and registered. If an object with the same material name is loaded again, the properties from the initial load are used and the new definition is ignored. This is done by design, for efficiency.
So for example:
Material "02 - default" has diffuse map red.dds and is set to mask. Then a second material is loaded with the same name: material "02 - default" has diffuse map blue.dds and is set to none (these properties will be ignored).
So if you have a lot of things named "02 - default", this is why you'll see different textures in different areas: the objects load in different orders with the same material names.
This will also trigger a material redefinition error in the error panel telling you what material and what property has been redefined.
Any material that is intended to be unique must have a unique name. If a material is used across multiple objects and is intended to be the same, then it can share that name, but it must have identical settings.
How can we create a glass effect, for windows?
Though HeroEngine does not have a specific effect for glass refraction, the following may work:
- Setup a HeroMaterial with a gradient alpha and set reflectivity to around 50.
- Add a small 128 or 256 cubemap to the environment settings.
It is not recommended to try for true reflectivity, since it is very resource-intensive, as compared to all the other things which may be more important for a game.
Another possible thing to try, is with a Vortex node:
- From the Create menu, choose Vortex Node. This is a screen effect that does some distortion.
- By modifying its properties, you can take out the spinning and change the normal map it uses to achieve a flat-glass look.
However, the problem with this technique is that it will distort anything that gets close to it, regardless of whether it is in front of or behind the glass. But it may be worth experimenting with.
For Hero's Journey, our artists have used the above techniques to create dark tinted glass which is opaque, with a small amount of reflectivity from the cubemap. This is an efficient solution because it still blocks sight lines and lowers the chance of "z-fighting".
See also: Mirrors
Other troubleshooting questions
- Sky box does not move with character
- Character or asset appears black
- Animated asset not displayed correctly
- Character is collapsed into a knot on the ground
- Character is not visible
- Character does not animate
- HeroMaterial redefinition error
- Speed Tree appears incorrect
- Object has checkered flashing texture instead of the correct texture
- Advice on how to plan for quantity of polygons, and texture sizes in your game