Everything posted by Chris.Huxley

1. Dates for the 2016 TUFLOW User Workshops have been announced. In addition to the annual new features workshop, aimed at bringing existing users up to date with the latest features, we are also providing computer-based training at introductory and advanced levels.

Brisbane
• Introductory TUFLOW Training: 19th April
• TUFLOW User Workshop: 20th April
• Advanced TUFLOW Training: 21st April

Melbourne
• Introductory TUFLOW Training: 10th May
• TUFLOW User Workshop: 11th May
• Advanced TUFLOW Training: 12th May

Sydney
• Introductory TUFLOW Training: 24th May
• TUFLOW User Workshop: 25th May
• Advanced TUFLOW Training: 26th May

Click the following link for more information and registration details: http://www.tuflow.com/Download/Training/L.TPS000.0_2016_TUFLOW_Classic_Workshop.pdf

Best Regards
Chris Huxley
2. We are pleased to announce dates for the TUFLOW New User and TUFLOW Advanced User 2015 USA Workshops:

New User Workshop: May 11th to May 13th
Advanced User Workshop: May 19th to May 21st
Location: Hyatt Regency (1209 L St, Sacramento, CA 95814)

Please refer to the TUFLOW website for the workshop content and cost details: http://www.tuflow.com/Training.aspx?usbt

Contact training@tuflow.com or chris.huxley@tuflow.com to register. I look forward to seeing you at the training.

Chris Huxley
TUFLOW Software Sales and Support (USA)
3. EnSight is a 3D result-viewing software package developed by CEI Inc. EnSight can be used for a variety of post-simulation TUFLOW result viewing and analysis tasks:
1) Viewing TUFLOW 1D/2D spatial time series datasets (xmdf)
2) Undertaking result dataset calculations
3) Extracting point time series results from xmdf datasets
4) Extracting time-varying long-profile data from xmdf datasets
5) Creating animations

Some basic user notes have been documented on the TUFLOW wiki: http://wiki.tuflow.com/index.php?title=Ensight_Tips

Cheers,
Chris
4. HEC-RAS and FLO-2D model conversion tools have been uploaded to the TUFLOW download page: http://www.tuflow.com/Tuflow%20Utilities.aspx

The user documentation for both utilities is provided on the TUFLOW wiki:
HEC-RAS to TUFLOW: http://wiki.tuflow.com/index.php?title=HEC-RAS_to_TUFLOW
FLO-2D to TUFLOW: http://wiki.tuflow.com/index.php?title=FLO2D_to_TUFLOW

Cheers,
Chris
5. A new version of the arcTUFLOW tools (2015-01-AA) is available for download from the TUFLOW website: http://www.tuflow.com/Tuflow%20Utilities.aspx

The updated toolbox includes a new tool, "Load Simulation Input Files". It reads the tlf file and loads all shapefiles into the Map window. It also applies a common symbology to the shapefiles depending on the file prefix. User notes are documented within the ArcGIS section of the TUFLOW Wiki: http://wiki.tuflow.com/index.php?title=ArcGis_Tips
6. We are pleased to announce that the next round of North America TUFLOW workshops will be run early next year. Two workshop courses will be offered for modelers of varying experience: new users and advanced users. For more information please refer to the training section of the TUFLOW website: http://www.tuflow.com/Training.aspx?usbt

If you have any queries, please don't hesitate to email training@tuflow.com

Best Regards
Chris Huxley
TUFLOW Software Sales and Support (USA)
7. Hi David,

You can control the length of the velocity vectors by adding a scale factor entry (-sf) to your syntax. For example:

%RUN% -mif -vector -grid10 -t99999 -sf0.5 "C:\mymodel_V.dat"
8. A picture of the GPU and CPU results may shed some light on the problem. Looking at the flood surface results will tell you more than PO plots. You say the GPU results are static throughout the entire event? From experience this typically means one of two things:
1) The cell at the PO location is dry (TUFLOW reports the ground level in the PO output in this instance); or
2) Downstream boundary issues (drowning out the model).

If your PO location is near the downstream boundary of your model it might be worth reviewing the boundary condition definition and inputs in your GPU model.

Cheers,
Chris
  9. Please ignore this post. The TUFLOW Support team are currently undertaking some tests on the TUFLOW forum. Regards, Chris Huxley
10. It seems you have misinterpreted the multiple domain functionality of TUFLOW. Within a given 2D domain the zpt layer point spacing needs to correspond to the "Cell Size" command you specified earlier in the tgc file. If you are using zpts which have been created using a different cell size, you will find that the underlying topography in your model will be wrong. To check this, you can import the zpt_check.mif/mid file to verify that the zpt elevation data processed by your model matches the input data you are using. The easiest way to identify any irregularities in the processed data is:
1) Create a DEM from the zpt check file elevations using Vertical Mapper.
2) Using Vertical Mapper, subtract the "check" DEM elevations from the DEM you used to initially interrogate your zpts.
3) If the resultant grid from the subtraction shows significant differences, you will need to investigate what is causing this (i.e. using zpts which do not match the cell size specified in the tgc).

If you want to increase the model resolution in your area of interest you will need to set up your model using multiple domains, with each domain corresponding to a unique grid size. An example of this type of setup is provided below (from the tcf). If you want to see a working example of a multiple domain model, refer to Module 6 of the TUFLOW tutorial model (http://www.tuflow.com/Downloads_Tutorial_Model.htm)

Cheers,
Chris

MI Projection == ..\model\mi\Projection.mif
Start 2D Domain == East_Domain_20m_Grid_Res
  BC Control File == ..\model\east domain.tbc
  Geometry Control File == ..\model\east domain.tgc
  Timestep (s) == 10
  Set IWL == 1
End 2D Domain
Start 2D Domain == West_Domain_10m_Grid_Res
  BC Control File == ..\model\west domain.tbc
  Geometry Control File == ..\model\west domain.tgc
  Timestep (s) == 5
  Set IWL == 1
End 2D Domain
Start Time (h) == 0.
End Time (h) == 12.
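The DEM subtraction in steps 1-3 above can also be checked outside Vertical Mapper as a plain grid difference. A minimal Python/NumPy sketch, assuming both DEMs have been exported to arrays on the same extent and resolution; the tiny arrays and the 0.01 m tolerance are placeholder values for illustration only:

```python
import numpy as np

# Hypothetical elevation grids: the DEM used to sample the zpts, and the
# DEM rebuilt from the zpt check file (same extent and cell size assumed).
source_dem = np.array([[10.0, 10.5],
                       [11.0, 11.5]])
check_dem = np.array([[10.0, 10.5],
                      [11.0, 12.5]])

# Difference grid: non-zero cells flag zpts that do not match the input DEM.
diff = check_dem - source_dem
tolerance = 0.01  # metres; set to the vertical accuracy of your data
mismatched = np.argwhere(np.abs(diff) > tolerance)
print(mismatched)  # row/column indices of cells needing investigation
```

Any indices reported point to zpts that were likely generated with the wrong cell size.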
11. I'm assuming you are talking about your MapInfo result grids here. If you look at your results in SMS you will see that the grid size in your finer domain area is represented correctly as 2m. If this is the case, you can set the grid size of your MI result grids to the finer resolution using TUFLOW_to_GIS.exe. By default (if a grid size is not specified) TUFLOW_to_GIS.exe will set the grid size of your post-processed results to half that of the first active domain in your tcf file. In your case, I'm assuming this is the 10m grid domain. If you add a "-grid" switch to your TUFLOW_to_GIS command line you can specify the grid size of the post-processed results. For example, the following command will create an asc file of the peak flood levels from my model simulation "x_h.dat" with a grid resolution of 2m:

TUFLOW_to_GIS.exe -asc -grid2 -t99999 x_h.dat

Hope this helps
12. Are you using the double precision version of TUFLOW? If not, you should be. If using the double precision version of TUFLOW doesn't fix your problem, the following Forum posts may give you some pointers. http://www.tuflow.com/forum/index.php?show...direct+rainfall
13. I also strongly agree with PHA: when you are doing direct rainfall modelling, you should always be using the double precision build of TUFLOW. http://www.tuflow.com/Downloads/Releases/T...9-07-AF-iDP.zip
14. You can process the date/time data in Excel using the CONCATENATE function to join the datasets. I've attached an example which uses the formats you provided, converting them into model time in hours. If you have lots of bc files that need this manipulation, writing a small macro in Excel will save you a lot of time... Once you've processed the data you will want to reference the "model_time" column in your bc database, not the "date" or "time" columns. Excel_Concatenate_Eg.xls
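The same concatenate-and-convert step can also be scripted outside Excel. A minimal Python sketch; the dd/mm/yyyy and hh:mm format strings and the event start time are assumptions here, so match them to what is actually in your bc files:

```python
from datetime import datetime

def to_model_time_hours(date_str, time_str, start):
    """Join separate date and time columns (the CONCATENATE step) and
    convert to decimal model time in hours from the event start."""
    # Format strings are assumptions - adjust to your bc file layout.
    stamp = datetime.strptime(f"{date_str} {time_str}", "%d/%m/%Y %H:%M")
    return (stamp - start).total_seconds() / 3600.0

# Hypothetical event start and one bc record:
start = datetime(1999, 2, 8, 0, 0)
print(to_model_time_hours("08/02/1999", "06:30", start))  # 6.5
```

The resulting decimal-hours column is what the bc database should reference, as noted above.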
15. I've also had this problem in the past, where I accidentally deleted some of the labels. There are two ways to get around this. You can open the WOR file in a text editor (e.g. UltraEdit), find the map window you are working in, identify the layer which you have labelled and delete the label "Object" commands. Deleting the Object commands will return the labels to their original locations. Try this on a copy of the workspace file first, in case you delete the wrong info by accident. For example:

Layer 4
  Display Graphic
  Global Pen (1,2,0) Brush (2,16777215,16777215) Symbol (35,0,12) Line (6,2,16736352) Font ("Arial",0,9,0)
  Label Line Arrow Position Above Font ("Arial",257,9,0,16777215) Pen (1,2,0)
    With Value
    Parallel Off Auto On Overlap Off Duplicates On Offset 2
    Visibility On
  Object 162
    Line Arrow
    Anchor (521870.47671086306,6762329.8955717944)

becomes

Layer 4
  Display Graphic
  Global Pen (1,2,0) Brush (2,16777215,16777215) Symbol (35,0,12) Line (6,2,16736352) Font ("Arial",0,9,0)
  Label Line Arrow Position Above Font ("Arial",257,9,0,16777215) Pen (1,2,0)
    With Value
    Parallel Off Auto On Overlap Off Duplicates On Offset 2
    Visibility On

This is handy if you have lots of workspaces that require the same fix. Otherwise, if it's just one workspace you're working with, remove the problem layer from the map window and then re-add it. You'll have to go through the process of setting the label properties for the new layer, but at least the labels will be in the right spot.

Chris
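If many workspaces need the same fix, the text-editor step can be scripted. A rough Python sketch that drops the per-label Object records; the WOR layout assumed here (an "Object N" line, then style lines, ending with an "Anchor" line) is an assumption and can vary between MapInfo versions, so verify it against your own files and always work on a copy:

```python
import re

def strip_label_objects(wor_text):
    """Remove "Object ... Anchor (...)" label-override records from a
    MapInfo workspace so labels revert to their default positions."""
    keep, in_record = [], False
    for line in wor_text.splitlines():
        stripped = line.strip()
        if re.match(r"Object \d+$", stripped):
            in_record = True          # start of a moved-label record
            continue
        if in_record:
            if stripped.startswith("Anchor"):
                in_record = False     # the Anchor line closes the record
            continue                  # drop everything inside the record
        keep.append(line)
    return "\n".join(keep)

# Hypothetical fragment mirroring the example above:
sample = "Visibility On\nObject 162\nLine Arrow\nAnchor (521870.47,6762329.89)\nLayer 5"
print(strip_label_objects(sample))
```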
16. Post TUFLOW build 2006-06-AA there is the option to model a varying Manning's roughness based on two defined depths. TUFLOW interpolates the roughness between these depths. Refer to pp. A-19 and A-20 of the TUFLOW manual (Section A5 Bed Resistance Commands (.tcf): Read Materials File == <file>). TUFLOW currently cannot accommodate more than two defined variable roughness depth values. [Bill Syme: This is only correct for builds prior to 2009-07-AA - please see next post for information on setting up n vs depth curves for materials.]

Cheers,
Chris
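As an illustration of the concept only (this is not necessarily the exact scheme the solver applies; check the manual sections referenced above), interpolating Manning's n between the two defined depth/roughness pairs looks like:

```python
def manning_n(depth, d1, n1, d2, n2):
    """Linearly interpolate Manning's n between two depth/roughness pairs,
    holding the end values outside the defined range. A conceptual sketch
    only - not TUFLOW's documented implementation."""
    if depth <= d1:
        return n1
    if depth >= d2:
        return n2
    return n1 + (n2 - n1) * (depth - d1) / (d2 - d1)

# Hypothetical material: n = 0.10 at 0.1 m depth, n = 0.04 at 1.0 m depth.
print(manning_n(0.5, 0.1, 0.10, 1.0, 0.04))
```

Shallow flow sees the higher roughness (vegetation drag) and deep flow the lower value, with a smooth transition between the two depths.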
17. To check how much RAM is being allocated to the 1D sections of the model, search for the following statement in the tlf file: "Total 1D domain memory (RAM) requested". Assuming it is a RAM issue, while you're looking through the tlf file, it would probably be worth also searching for the following:
- "Total Memory requested thus far"
- "ESTRY nodes to be created"

If you are creating a model which has multiple 2D domains, I would also recommend you include the following command in the tcf: "Reveal 1d Nodes == ON". This will allow you to see the 'hidden' 1D nodes that are automatically generated when the 2D/2D connection is being processed. Using this command will include the data from the generated 2D/2D connection nodes in the eof and model check files. If nothing jumps out at you by checking the above things, just forward me the model in the new year and I'll see what I can do.

Chris
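If you have many runs to check, scanning the tlf for those phrases can be scripted. A small Python sketch; the search phrases are taken from the post above, but the sample log lines are made up for illustration:

```python
# Phrases from the post; the surrounding tlf format is an assumption.
PHRASES = [
    "Total 1D domain memory (RAM) requested",
    "Total Memory requested thus far",
    "ESTRY nodes to be created",
]

def find_memory_lines(tlf_lines):
    """Return every log line containing one of the memory-related phrases."""
    return [line for line in tlf_lines
            if any(phrase in line for phrase in PHRASES)]

# Hypothetical log excerpt:
log = [
    "Reading tgc file...",
    "Total 1D domain memory (RAM) requested = 512 MB",
    "Simulation started.",
]
print(find_memory_lines(log))
```

In practice you would read the lines from the real tlf file (e.g. `open("run.tlf")`) rather than a hard-coded list.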
18. Hi Scott,

There shouldn't be a limit to the number of nodes which TUFLOW can process unless the preallocation of RAM for the 1D section of the model exceeds your computer's specs. Can you please make a copy of the model and email it to support@tuflow.com? If the model is too large to email, let me know and I'll send you an ftp site to upload to.

Cheers,
Chris
19. If you've already set things up using the zshape triangulation and are finding the pre-processing time a bit long, you can avoid this in future simulations by doing the following:
- Include the write check files command in your tcf (e.g. Write Check Files == C:\jb9999\tuflow\check\2d);
- Run your simulation with the zshape command;
- After your check files are written, import your zpt check file;
- Copy the zpts from the check file for the area in question to a new file with an appropriate name;
- Replace your zshape command with a "Read MID Zpts == filename.mid" (make sure this command is located below the initial base geometry zpts command in your tgc file).

Good luck
Chris
20. One of the powerful features of TUFLOW is the ability to update your model geometry incrementally. The golden rule here is: TUFLOW applies its geometry commands sequentially, i.e. later commands overwrite the geometry set by preceding commands. Hence when you are updating a certain part of a model (e.g. for a road development), you only need to update the geometry (e.g. using zpts, zlines and materials) in the area of the development, not replace the zpts for the entire model. When I am setting up my tgc I do it in the following way:
1) Set up the model bounds for the full 2D area
2) Isolate the active domain via the code polygon
3) Set up my base geometry and material layers for the full active domain
4) Refine the base geometry using features such as zlines to represent things that cannot easily be represented using the base zpts (e.g. levees)

Once you have run the model, import your zpt check file. To write the check files you will need to include the following command in the tcf: "Write Check Files == (path where you want the check files to be written)". Once your base model is working well, I would create a new model to represent the developed case. To do this you simply need to update your existing tgc file with the developed case geometry features (remember to include these commands after the base case geometry commands). Once again, import your zpt check file for the developed case to ensure the developed case geometry is being represented correctly. As a further tip, I usually create a TIN of my zpt check file using Vertical Mapper; it's an easier way to see the geometry than looking at a series of points. I've attached two example tgc files which should help.

Good Luck
Pre_development.tgc
Post_development.tgc
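The sequential-overwrite rule can be pictured with a toy example: each later "layer" overwrites only the cells it covers, which is why a developed case only needs commands for the development footprint. A conceptual NumPy sketch (arrays stand in for zpt grids; the values are made up):

```python
import numpy as np

# Base-case zpts: a flat 10.0 m DEM over a 4x4 grid of points.
base = np.full((4, 4), 10.0)

# Developed-case layer: a road embankment at 12.0 m covering one row only;
# NaN marks cells the layer does not touch.
road = np.full((4, 4), np.nan)
road[1, :] = 12.0

# Apply layers in tgc order: the later layer overwrites only its own cells.
zpts = base.copy()
covered = ~np.isnan(road)
zpts[covered] = road[covered]

print(zpts[1, 0], zpts[0, 0])  # embankment cell vs untouched base cell
```

Reversing the order would bury the embankment under the base DEM, which is why the developed-case commands must come after the base-case geometry commands.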
21. From the sound of it you are basing your roughness for the lake area on a value representative of flood waters flowing over a bed of standing water? If this is the case it is probably not the correct approach. When I am modelling lakes I apply a Manning's roughness based on the vegetation in the lake, using the initial water level command (Set IWL) to set the standing water level in the lake. If, for example, the lake is relatively sparsely vegetated (possibly with some emergent macrophytes around the edges), a Manning's roughness of 0.03 may be applicable. This is something you can check when you are calibrating your model.

Chris
22. This is quite a bizarre one. I am not sure why it works, but it seems to do the job. Firstly, the command line you are using is correct. I have come across the same problem when I have created a header.mif file and located it in the same folder as the my_model.h file, with the batch (.bat) file located in a different folder. Similar to you, after running the batch file, I have scrolled through the DOS window to see that the header was used (which it was), only to find that it was not applied to the mif file??? After some playing around I found that if I place an additional copy of the same header.mif file in the same folder as the .bat file, the created .mif file is now set to the correct projection instead of non-earth coordinates. Scrolling through the DOS prompt shows that TUFLOW_to_GIS doesn't make any reference to the header file in the same location as the batch file, but it seems to work. The attached figure may clarify what I am trying to explain.

Alternatively, you can manually change the header line in the .mif file to the correct projection using UltraEdit. You can do this one by one or (if you have to do this step numerous times) globally for all mif files in a specified folder. The global change is done using the "Replace in Files" command under the "Search" menu in UltraEdit. To do this:
1) Enter the entire header line as the search term.
2) Input the text to replace the header line with (your required projection).
3) Select the file type (.mif).
4) Select the directory in which UltraEdit will update the files.

WARNING: if you are using this method, UltraEdit will replace the search term in all .mif files in the defined directory. If you deliberately have .mif files in this directory using non-earth coordinates, do not use this "global" search and replace method.

Hope this helps/makes sense.
Chris Huxley
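If UltraEdit isn't to hand, the same global replace can be scripted. A Python sketch carrying the same warning as above; both header strings here are placeholders (the NonEarth line and the Earth projection shown are examples only - substitute your actual header lines), and back the folder up first, skipping any .mif files that are deliberately non-earth:

```python
import os

# Placeholder header lines - replace with the actual CoordSys lines
# from your own .mif files before running.
OLD_HEADER = 'CoordSys NonEarth Units "m"'
NEW_HEADER = 'CoordSys Earth Projection 8, 116, "m", 153, 0, 0.9996, 500000, 10000000'

def fix_mif_headers(folder):
    """Replace OLD_HEADER with NEW_HEADER in every .mif file in a folder.
    Returns the names of the files that were changed."""
    changed = []
    for name in sorted(os.listdir(folder)):
        if not name.lower().endswith(".mif"):
            continue
        path = os.path.join(folder, name)
        with open(path) as f:
            text = f.read()
        if OLD_HEADER in text:
            with open(path, "w") as f:
                f.write(text.replace(OLD_HEADER, NEW_HEADER))
            changed.append(name)
    return changed
```

Like the UltraEdit method, this touches every .mif in the directory, so point it only at folders where the blanket replacement is safe.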