dap2arc

Create layer properties dynamically based on variable range

rsignell-usgs opened this issue on Jun 20, 2013 · 3 comments (Open)

Now that we have the ability to select variables, it would be great if the properties of the layer could be dynamically determined by the range of the data selected. The default layer can be pretty ugly: in the image below I just selected salinity, leaving the default URL, time step, and model layer, and this is what I got:

[screenshot: default symbology for the selected salinity layer, 6-20-2013]

rsignell-usgs commented on Jun 20, 2013

Agreed, this would be great. The main limitation I see is that layer symbology support is still limited in ArcPy; from the docs:

Depending on the symbology type, a layer's symbology can be modified. There are a limited number of supported symbology types for which properties and methods are available. It is good practice to first test the layer's symbologyType property. If a value of OTHER is returned, then the layer's symbology can't be modified. If the value returned is not OTHER, then the layer's symbology property will return one of the following symbology classes, each with their own unique set of methods and properties: GraduatedColorsSymbology, GraduatedSymbolsSymbology, RasterClassifiedSymbology, and UniqueValuesSymbology.

So, if we can swap things to a raster layer, we can do a reclassification on the fly with the data, but the same method doesn't look to be available with the TIN. Alternatively, we can fire up ArcObjects and make the changes from there. As an example, here's a C++ snippet that I wrote recently for a project that is primarily a Python toolbox: geodesic.cpp.
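For the raster route, here's roughly what I have in mind; this is a minimal sketch only, assuming an ArcGIS 10.1-style arcpy.mapping raster layer, and the layer file name and number of classes are placeholders:

import arcpy

# minimal sketch: the .lyr file and the class count are hypothetical
lyr = arcpy.mapping.Layer("salinity_raster.lyr")

# per the docs: test symbologyType first; if it comes back OTHER, we can't touch it
if lyr.symbologyType == "RASTER_CLASSIFIED":
    # pull the actual data range off the raster itself
    vmin = float(arcpy.GetRasterProperties_management(lyr.dataSource, "MINIMUM").getOutput(0))
    vmax = float(arcpy.GetRasterProperties_management(lyr.dataSource, "MAXIMUM").getOutput(0))
    n_classes = 7
    step = (vmax - vmin) / n_classes
    # spread the class breaks evenly across the observed range
    lyr.symbology.classBreakValues = [vmin + i * step for i in range(n_classes + 1)]
    lyr.save()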

Another option would be generating a pre-classified output, so that the classes are known, and we just visualize everything with those default classes. Do any of these three options sound particularly appealing?

scw commented on Jun 21, 2013

Since we've gone to the trouble of loading the native grid, we don't want to reduce it to a raster for visualization, so I guess I'd prefer option 2 (the ArcObjects solution) or option 3 (a dictionary matching variable info with a bunch of precomputed layer files). I guess 3 would be the easiest, but kind of cumbersome. I'm not sure what option 2 entails. Basically supplying a binary that can be called from arcpy?

rsignell-usgs commented on Jun 21, 2013

In terms of raw time, option 3 is likely faster -- we could gin up 20 or so layer files which cover the ranges of variables pretty quickly, and determine the correct one to use based on the value range.
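Roughly what I'm picturing for the layer-file route; a sketch only, where the template paths, the value ranges, and the example min/max are all placeholders:

import arcpy

# sketch of option 3: a handful of precomputed .lyr templates binned by value range
LAYER_FILES = [
    (0.0, 1.0, "templates/range_0_1.lyr"),
    (0.0, 50.0, "templates/range_0_50.lyr"),
    (-5.0, 40.0, "templates/range_neg5_40.lyr"),
]

def pick_layer_file(vmin, vmax):
    """Return the first template whose range covers the data's min/max."""
    for lo, hi, lyr_file in LAYER_FILES:
        if lo <= vmin and vmax <= hi:
            return lyr_file
    return None

# example min/max for the selected variable (in practice, read from the data)
data_min, data_max = 28.0, 36.5
template = pick_layer_file(data_min, data_max)
if template:
    # copy the template's symbology onto the freshly created output layer
    arcpy.ApplySymbologyFromLayer_management(output_layer, template)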

Option 2 gives more flexibility: we could dynamically control the elements as needed, and even expose options for the classification from the tool. The code is all in C# or C++, and the end user gets a DLL with the specific functions, which is called from Python using ctypes, something like this, where dll_path is the DLL location:

import ctypes

try:
    # dll_path points at the compiled DLL on disk
    loaded_dll = ctypes.cdll.LoadLibrary(dll_path)
except Exception as e:
    msg = "Failed to load high-speed geodesic library."
    utils.msg(msg, mtype='error', exception=e)  # project logging helper
    return None
# look up the exported function and declare its signature
fn = loaded_dll.CalculatePairwiseGeodesicDistances
fn.argtypes = [ctypes.c_wchar_p, ctypes.c_wchar_p]
fn.restype = ctypes.c_int

From there, you can call fn(first_param, second_param) and it'll get passed down to the C++/C# code as needed, which can do everything that ArcMap can do, once you dig through enough classes.
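The call site ends up looking something like this; the paths are just placeholders for whatever the two parameters actually are:

# hypothetical call: two path strings marshalled as wide chars down to the DLL
result = fn(u"C:/data/points_a.shp", u"C:/data/distances.csv")
if result != 0:
    utils.msg("Geodesic routine returned error code {0}".format(result), mtype='error')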

scw commented on Jun 21, 2013