Accessing UV coordinates

Discussions concerning programming of SOFTIMAGE©
User avatar
Grubber
Posts: 165
Joined: 22 Jun 2009, 21:11
Location: Vilnius, Lithuania

Accessing UV coordinates

Post by Grubber » 13 Nov 2014, 22:34

Hi,

I am trying to access the coordinates of a selected UV in the Texture Editor.
Could anyone tell me how it can be done using Python? Or is it a very complicated task?

Thanks!

User avatar
rray
Moderator
Posts: 1775
Joined: 26 Sep 2009, 15:51
Location: Bonn, Germany
Contact:

Re: Accessing UV coordinates

Post by rray » 13 Nov 2014, 23:50

http://softimage.wiki.softimage.com/sdk ... ibutes.htm

Assuming "auto sync selection" is turned on would probably make it easier: you could then just use the app selection instead of the awkward string attributes that you get from the viewport attributes.
softimage resources section updated Jan 5th 2024

User avatar
Grubber
Posts: 165
Joined: 22 Jun 2009, 21:11
Location: Vilnius, Lithuania

Re: Accessing UV coordinates

Post by Grubber » 14 Nov 2014, 22:09

Thanks for the link, but I do not really know how to use those, as I am a complete beginner.
I am trying to use

Code: Select all

Application.Selection(0).SubComponent
which returns, for example,

Code: Select all

cylinder.sample[161]
when one vertex is selected in the Texture Editor. How should I go on from here, if it is possible?

Again, thank you very much ^:)^

User avatar
rray
Moderator
Posts: 1775
Joined: 26 Sep 2009, 15:51
Location: Bonn, Germany
Contact:

Re: Accessing UV coordinates

Post by rray » 16 Nov 2014, 13:10

You'll have to go through the object model. I think the actual UV coordinates are a ClusterProperty (usually named "Texture_Projection") on a sample cluster, so from your sample index you would have to look up the same index in this property to get to the coordinates.
I'm not sure about the details right now, so I can't get more specific. If you're still on it, I might get a chance to look a bit deeper into it during the week.

I think the best option is to look at what the Jester addon does (in your "align to edge" thread)
softimage resources section updated Jan 5th 2024

User avatar
Grubber
Posts: 165
Joined: 22 Jun 2009, 21:11
Location: Vilnius, Lithuania

Re: Accessing UV coordinates

Post by Grubber » 16 Nov 2014, 17:02

I was thinking the same thing about using the index. I'll look into that plugin too. Thanks!

User avatar
Grubber
Posts: 165
Joined: 22 Jun 2009, 21:11
Location: Vilnius, Lithuania

Re: Accessing UV coordinates

Post by Grubber » 18 Nov 2014, 17:50

So, I actually managed to get the solution:

Code: Select all

app = Application

# Find the selected object under the scene root by name
selectedPolyObject = app.ActiveSceneRoot.Children(app.Selection(0).Name)

# CurrentUV is the active texture projection (a ClusterProperty)
UVs = selectedPolyObject.Material.CurrentUV
UVArray = UVs.elements.array
This gives a two-dimensional array with three rows: the first row holds the u coordinates, the second the v, and the third the w.
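In plain Python terms, the structure looks like this (the array below is made-up data standing in for `UVs.elements.array`):

```python
# Hypothetical stand-in for UVs.elements.array: one row per axis (u, v, w),
# one column per sample index.
uvw = [
    [0.10, 0.25, 0.90],   # u coordinates of samples 0..2
    [0.50, 0.75, 0.10],   # v coordinates of samples 0..2
    [0.0, 0.0, 0.0],      # w coordinates (usually unused for 2D UVs)
]

def uv_of_sample(uvw, sample_index):
    """Return the (u, v) pair of one sample from the axis-major array."""
    return (uvw[0][sample_index], uvw[1][sample_index])

print(uv_of_sample(uvw, 1))  # (0.25, 0.75)
```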

Now I have run into another problem: selecting a whole UV island from a selected vertex. Any ideas?

Thanks

User avatar
Grubber
Posts: 165
Joined: 22 Jun 2009, 21:11
Location: Vilnius, Lithuania

Re: Accessing UV coordinates

Post by Grubber » 20 Nov 2014, 10:00

OK, I am still at this.
After spending some time trying to select a UV island from a selected UV, I decided to approach this another way: first I select the island, and then invoke the PickElement command. But this time I cannot manage to select an edge in the Texture Editor. Maybe someone has ideas on approaching that? I would really appreciate the help. :-ss

Thanks

User avatar
myara
Posts: 403
Joined: 28 Sep 2011, 10:33

Re: Accessing UV coordinates

Post by myara » 20 Nov 2014, 11:20

What I do is set Texture Editor Preferences / Sync Method to Components, and the selection filter to Polygon faces, as a requirement (manually; unfortunately I've never found a way to change this option with scripting).

Then, when you select an island in the UV Editor, it will automatically select faces in the viewport. Scripting will only recognize the faces, so you'll need to convert those faces to samples and work with them.

I did a mirror UV tool a few years ago, so I needed to know the UV coordinates to calculate the UV bounding box and flip around the highest/lowest coordinate.
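The bounding-box flip described above can be sketched in plain Python (the `us` list is made-up data; in Softimage the coordinates would come from the projection's element array):

```python
def mirror_u(us):
    """Mirror a list of u coordinates around the center of their UV bounding box."""
    lo, hi = min(us), max(us)
    # Reflecting u around the bbox midpoint (lo + hi) / 2 is u -> lo + hi - u.
    return [lo + hi - u for u in us]

us = [0.25, 0.5, 0.75]
print(mirror_u(us))  # [0.75, 0.5, 0.25]
```

The same function applied to the v row mirrors vertically instead.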

I never finished it and I don't think I ever will, but it works so far, so you can try it here:

https://www.dropbox.com/s/z9is3wbj68bxh ... UV.js?dl=0

Sorry it's in JScript, not Python. You can ignore the VBarray workarounds.
M.Yara
Character Modeler | Softimage Generalist (sort of)

User avatar
Grubber
Posts: 165
Joined: 22 Jun 2009, 21:11
Location: Vilnius, Lithuania

Re: Accessing UV coordinates

Post by Grubber » 20 Nov 2014, 14:24

Thanks, myara. ^:)^

I already figured out how to access UVs. Now I am on the quest of selecting a UV island from a selected sample. :)

User avatar
myara
Posts: 403
Joined: 28 Sep 2011, 10:33

Re: Accessing UV coordinates

Post by myara » 20 Nov 2014, 15:21

Do you want to select a UV island from what, a UV selection? I can't think of anything fast enough to be usable at scripting level. Even worse if you are using Python.

UVs in Softimage are not merged, so if you get the sample selection, the only way I can think of to find a UV island would be to iterate through all of the samples and check whether they sit on the same UV coordinate, to assume they are "merged" and in the same island.
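The brute-force idea above can be sketched with a union-find over sample indices. Everything below is made-up illustrative data; in Softimage the coordinates would come from the projection's element array and the per-polygon sample indices from the geometry:

```python
def find(parent, i):
    """Find the root of i with path compression."""
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def union(parent, a, b):
    """Merge the groups containing a and b."""
    parent[find(parent, a)] = find(parent, b)

def uv_islands(sample_uvs, polygons):
    """Group sample indices into UV islands.

    sample_uvs: list of (u, v) per sample.
    polygons: list of sample-index tuples, one per polygon.
    Samples in the same polygon are connected; samples sitting on the
    same (rounded) coordinate are treated as merged, per the assumption above.
    """
    n = len(sample_uvs)
    parent = list(range(n))
    # Samples that share a polygon belong together.
    for poly in polygons:
        for s in poly[1:]:
            union(parent, poly[0], s)
    # Samples on the same coordinate are assumed "merged".
    seen = {}
    for i, (u, v) in enumerate(sample_uvs):
        key = (round(u, 6), round(v, 6))
        if key in seen:
            union(parent, seen[key], i)
        else:
            seen[key] = i
    # Collect one set of sample indices per island root.
    islands = {}
    for i in range(n):
        islands.setdefault(find(parent, i), set()).add(i)
    return list(islands.values())

# Two triangles stitched along shared coordinates, plus one detached triangle.
sample_uvs = [(0, 0), (1, 0), (0, 1),
              (1, 0), (1, 1), (0, 1),
              (5, 5), (6, 5), (5, 6)]
polygons = [(0, 1, 2), (3, 4, 5), (6, 7, 8)]
print(uv_islands(sample_uvs, polygons))
```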
M.Yara
Character Modeler | Softimage Generalist (sort of)

User avatar
Grubber
Posts: 165
Joined: 22 Jun 2009, 21:11
Location: Vilnius, Lithuania

Re: Accessing UV coordinates

Post by Grubber » 20 Nov 2014, 20:33

Yeah, I meant a selected UV.

I do not understand the concept of two UVs with the same coordinates; I don't think there are any such UVs in the whole collection. In the attached image I tried to show what I mean: A and B belong to the same island, but their coordinates are different.
Attachments
uv.jpg

User avatar
myara
Posts: 403
Joined: 28 Sep 2011, 10:33

Re: Accessing UV coordinates

Post by myara » 26 Nov 2014, 07:59

For each UV corner you have 2 samples, so for each point (vertex) you'll have anywhere from 2 to more than 4 samples.
Samples are not merged.

In your attachment, point A has 2 samples at the same coordinate.
Press the T button in your Texture Editor to see all samples.

Iterating through all samples would be extremely slow, and Python is especially slow at these tasks.
VBScript would be faster, but still slow enough to take a coffee break every time you use the tool on a fairly high-poly object.

If you must continue with your UV tool project, I'd suggest trying the C++ API.
That is, if you must. I wouldn't start learning a complicated API for a discontinued software package.
M.Yara
Character Modeler | Softimage Generalist (sort of)

User avatar
Grubber
Posts: 165
Joined: 22 Jun 2009, 21:11
Location: Vilnius, Lithuania

Re: Accessing UV coordinates

Post by Grubber » 26 Nov 2014, 16:56

Thank you very much for the explanation. But I still do not see how to determine whether the samples at points A and B belong to the same island.

Anyway, I was hoping it would not be so difficult. And, for certain, I won't learn the C++ API.

Thanks again.

mantom
Posts: 10
Joined: 21 Nov 2014, 07:55

Re: Accessing UV coordinates

Post by mantom » 07 Dec 2014, 18:11

You have to build a lookup map, then use it to find the siblings of your selected samples.

A 'sample' (aka polygon node, in the case of polygon meshes) is Softimage's term for an unshared vertex. Back in the day, polygons were described individually: each polygon had a physical vertex for each corner. In the case of a cube there would be 24 vertices (4 vertices per polygon x 6 polygons), with 3 sharing the same physical position in 3D space at each corner. You can view this construction in Softimage by creating a primitive cube, selecting all edges, then choosing 'disconnect components' from the context menu. This was obviously a waste of data and not practical in the general case, so formats were re-engineered to 'share' vertices by referencing them as many times as needed. When more than one polygon referenced the same vertex, the vertex would become 'shared', fusing the polygons at that vertex. If all the vertices of an edge were shared, then so was the edge, and so on. When all vertices and edges are shared, you get the cube as we know it today.

While sharing vertices solves the shape problem, it creates a new one: each vertex of a polygon carried (optional) additional metadata such as a normal, a UV texture coordinate, and a color. If vertices are shared, what happens to this metadata? That's where the sample comes in: it is essentially the original unshared vertex's metadata. So for each physical vertex on the mesh, there will be a normal, a UV texture coordinate, and a color value on each polygon joined at that vertex. The cube still has 8 physical vertices, but 24 copies of the metadata. This allows each polygon on the mesh to have unique values separate from its neighbors; otherwise you'd get color bleeding, averaged normals, and so on.

What you're asking to do is get a list of selected samples from the texture editor and find all other samples connected to them. To do that you must find which vertices own the selected samples, then traverse each of those vertices to obtain the samples' siblings, then do whatever it is you want to do with them.

The Softimage API is very good about getting access to components and neighbor components when you drill from top down through the mesh geometry structure (Object > Polygons > Edges > Vertices > Samples), but it's not as good when going the other direction (Samples > Vertices > Edges > Polygons > Object). What you need to do is drill from the top downwards and build a map of which samples belong to which vertices, then use that as your lookup table when going the other direction in your code to do whatever it has to do. Even for large meshes this can be performed fairly quickly.
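The map described above can be sketched in plain Python. The nested lists are hypothetical stand-ins for `Geometry.Points` and each point's `Nodes`; in Softimage you would fill them by iterating the geometry top-down:

```python
def build_sample_to_vertex_map(point_samples):
    """point_samples[v] lists the sample indices owned by vertex v
    (in Softimage: for each Point, the indices of its Nodes)."""
    sample_to_vertex = {}
    for vertex_index, samples in enumerate(point_samples):
        for sample_index in samples:
            sample_to_vertex[sample_index] = vertex_index
    return sample_to_vertex

def siblings_of(sample_index, point_samples, sample_to_vertex):
    """All samples sharing the same vertex as sample_index (itself excluded)."""
    v = sample_to_vertex[sample_index]
    return [s for s in point_samples[v] if s != sample_index]

# Made-up mesh fragment: e.g. a corner where 3 polygons meet,
# so vertex 0 owns samples 0, 7, and 12.
point_samples = [[0, 7, 12], [1, 5], [2, 3, 4]]
lut = build_sample_to_vertex_map(point_samples)
print(siblings_of(7, point_samples, lut))  # [0, 12]
```

With the lookup table built once, each sibling query is a dictionary hit plus a short list scan, which is why even large meshes stay fast.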

Attached is an example plugin demonstrating how to build a sample-to-vertex map, get the selected samples in the texture editor, look up the selected samples' siblings, and copy the UVW coordinates to those siblings. It's written in JScript, but it shouldn't be hard to convert to whatever language you're using.

Matt


Code: Select all

//======================================================================================
// ML_UVWCopyToSiblings() v1.0 Matt Lind
//
// Given an axis, sample indices, and texture projection, copy the UVW values
// from the selected samples onto the sibling samples belonging to the same point.
// This operation is similar to a 'heal' operation.
//======================================================================================

//===================================================================
// Constants()
//===================================================================
function Constants()
{
	this.PRG = "ML_UVWCopyToSiblings";
	this.MENU_LABEL = "ML UVW Copy To Siblings";
	this.DEBUG = false;
}

//===================================================================
// XSILoadPlugin() - Registers plugin with XSI
//
//===================================================================
function XSILoadPlugin( oPluginRegistrar )
{
	var oConstants = new Constants();
	
	oPluginRegistrar.Author = "Matt Lind";
	oPluginRegistrar.Name   = oConstants.PRG;
	oPluginRegistrar.Email  = "";
	oPluginRegistrar.URL    = "";
	oPluginRegistrar.Major  = 1;
	oPluginRegistrar.Minor  = 0;

	oPluginRegistrar.RegisterCommand( oConstants.PRG, oConstants.PRG );
	
	oPluginRegistrar.RegisterMenu( siMenuTextureEditorToolsID, oConstants.PRG + "Menu", false, false );	// Texture Editor > Tools

	return( true );
}

//===================================================================
// XSIUnloadPlugin() - Removes plugin from XSI
//
//===================================================================
function XSIUnloadPlugin( oPluginRegistrar )
{
	return( true );
}

//===================================================================
// _Init() - Executed when command invoked 1st time after loading in XSI.
//
//===================================================================
function ML_UVWCopyToSiblings_Init( oContext )
{
	var oCommand = oContext.Source;
	
	oCommand.Tooltip     = "Sets UVW coordinates of samples belonging to the same vertex as the selected sample(s).";
	oCommand.Description = oCommand.Tooltip;
	oCommand.ReturnValue = true;
	
	// Add input arguments to the command
	var oArguments = oCommand.Arguments;

	oArguments.Add( "Axis", siArgumentInput, 0, siUInt2 );
	oArguments.AddWithHandler( "oSelectedSamples",    siArgHandlerCollection );
	oArguments.AddWithHandler( "oTextureProjections", siArgHandlerCollection );
}

//===================================================================
// _Term() - Executed when terminated.
//
//===================================================================
function ML_UVWCopyToSiblings_Term( oContext )
{
	var oCommand = oContext.Source;
}

//===================================================================
// Menu_Init()
//
//===================================================================
function ML_UVWCopyToSiblingsMenu_Init( oContext )
{
	var oConstants = new Constants();
	
	var oMenu = oContext.Source;
	oMenu.AddCallbackItem( oConstants.MENU_LABEL, oConstants.PRG + "Menu_OnClicked" );
}

//===================================================================
// Menu_OnClicked()
//
//===================================================================
function ML_UVWCopyToSiblingsMenu_OnClicked( oContext )
{
	var oConstants = new Constants();
	var oView      = oContext.GetAttribute( "Target" );
	
	// Get selected sample point(s) and active texture projection(s)
	// (returns results as strings)
	var SelectedSamplePoints = oView.GetAttributeValue( "selectedsamplespoints" );
	var SelectedProjections  = oView.GetAttributeValue( "selectedprojections"   );
	
	if ( oConstants.DEBUG ) {
		LogMessage( 
			"    Selected Samples: "   + SelectedSamplePoints +
			"\nSelected Projections: " + SelectedProjections, 
			siComment 
		);
	}
	
	if ( !SelectedSamplePoints ) {
		LogMessage( "[" + oConstants.PRG + "] No samples specified", siError );
		return( false );
	}
	
	if ( !SelectedProjections ) {
		LogMessage( "[" + oConstants.PRG + "] Texture Projection not specified", siError );
		return( false );
	}
	
	// replace ';' with ',' so strings can be converted to collections when _Execute() is called
	var SelectedSamplePoints = SelectedSamplePoints.replace( /\;/gi, "," );
	var SelectedProjections  = SelectedProjections.replace(  /\;/gi, "," );
	
	// Process the selected samples
	// We'll call _Execute() twice, once each for U and V 
	// (to simplify code and make it more modular for other tools)
	try {
		// U Direction
		var aArguments = new Array(
			siUDirection,
			SelectedSamplePoints,
			SelectedProjections
		);

		Application.ExecuteCommand( oConstants.PRG, aArguments );
		
		// V Direction
		var aArguments = new Array(
			siVDirection,
			SelectedSamplePoints,
			SelectedProjections
		);

		Application.ExecuteCommand( oConstants.PRG, aArguments );
	} catch(e) {
		LogMessage( "[" + oConstants.PRG + "] " + e.description, siError );
		return( false );
	}
	
	return( true );
}

//======================================================================
// _Execute()
//
//======================================================================
function ML_UVWCopyToSiblings_Execute( Axis, oSelectedSamples, oTextureProjections )
{
	var oConstants = new Constants();
	
	if ( oConstants.DEBUG ) {
		LogMessage( 
			" --- ML_UVWCopyToSiblings() ---" +
			"\n               Axis: " + Axis +
			"\n   Selected Samples: " + oSelectedSamples +
			"\nTexture Projections: " + oTextureProjections,
			siComment 
		);
	}
	
	if ( oSelectedSamples.Count <= 0 ) {
		LogMessage( "[" + oConstants.PRG + "] No samples specified", siError );
		return(-1);
	}
	
	if ( oTextureProjections.Count <= 0 ) {
		LogMessage( "[" + oConstants.PRG + "] No Texture Projections specified", siError );
		return(-1);
	}

	for ( var i = 0; i < oTextureProjections.Count; i++ ) {
	
		var oTextureProjection = oTextureProjections(i);
		
		if ( oTextureProjection.Type != "uvspace" ) {
			LogMessage( "[" + oConstants.PRG + "] Not a texture projection: " + oTextureProjection.FullName, siError );
			continue;
		}
	
		// Create maps to store temporary values for faster lookup
		var aSampleValueMap      = new Array();
		var aPointSampleValueMap = new Array();	
	
		// Get texture projection UVW values
		var oTextureUVCoordinateData  = XSIFactory.CreateGridData();
		oTextureUVCoordinateData.Data = oTextureProjection.Elements.Array;
		
		// Get selected sample indices in the texture projection
		var oElement = oSelectedSamples(i);
	
		if ( oConstants.DEBUG ) {
			LogMessage( 
				"----------------------" +
				"\nTexture Projection: " + oTextureProjection.FullName +
				"\n           Samples: " + oElement.Type + ClassName( oElement ),
				siComment
			);
		}
		
		var oObject  = oElement.SubComponent.Parent3DObject;
		var oSamples = oElement.SubComponent.ComponentCollection;
		var oPoints  = oObject.ActivePrimitive.Geometry.Points;
		
		// Record values of selected samples in a map	
		for ( var j = 0; j < oSamples.Count; j++ ) {
		
			var oSample     = oSamples(j);
			var SampleValue = oTextureUVCoordinateData.GetCell( Axis, oSample.Index );
			
			aSampleValueMap[ oSample.Index ] = SampleValue;
			
			if ( oConstants.DEBUG ) {
				LogMessage( "Sample[" + oSample.Index + "]: " + SampleValue, siComment );
			}
		}
		
		// Traverse geometry and mark points containing the selected samples
		for ( var j = 0; j < oPoints.Count; j++ ) {
			
			var oPoint = oPoints(j);
			var oNodes = oPoint.Nodes;
			
			for ( var k = 0; k < oNodes.Count; k++ ) {
			
				var oNode = oNodes(k);
				
				if ( aSampleValueMap[ oNode.Index ] != undefined ) {
				
					if ( !aPointSampleValueMap[ oPoint.Index ] ) {
					
						aPointSampleValueMap[ oPoint.Index ] = aSampleValueMap[ oNode.Index ];
						
						if ( oConstants.DEBUG ) {
							LogMessage( "Point[" + oPoint.Index + "]: " + aSampleValueMap[ oNode.Index ], siComment );
						}
					}
				}
			}
		}
		
		// Copy value(s) from selected samples onto unselected siblings belonging to the same point.
		for ( PointIndex in aPointSampleValueMap ) {
		
			var oPoint      = oPoints( PointIndex );
			var SampleValue = aPointSampleValueMap[ PointIndex ];
			var oNodes      = oPoint.Nodes;
			
			if ( oConstants.DEBUG ) {
				LogMessage( "Point[" + oPoint.Index + "]: " + oPoint.Index, siComment );
			}
			
			for ( var j = 0; j < oNodes.Count; j++ ) {
			
				var oNode = oNodes(j);
				
				if ( aSampleValueMap[ oNode.Index ] == undefined ) {
					if ( oConstants.DEBUG ) {
						LogMessage( "Node[" + oNode.Index + "]: " + SampleValue, siComment );
					}
					oTextureUVCoordinateData.SetCell( Axis, oNode.Index, SampleValue );
				}
			}
		}
		
		// Update texture projection UVW values
		oTextureProjection.Elements.Array = oTextureUVCoordinateData.Data;
	}

	return(0);
}

User avatar
Grubber
Posts: 165
Joined: 22 Jun 2009, 21:11
Location: Vilnius, Lithuania

Re: Accessing UV coordinates

Post by Grubber » 07 Dec 2014, 19:02

mantom wrote:You have to build a lookup map, then use it to find the siblings of your selected samples.

[…]
Thanks, man, for the explanation. I will try to analyze your example. I really appreciate it!
