39 Coachwood Drive - Wollongong City Council

Client: Wollongong City Council
Location: Australia, NSW
Consultant: Wollongong City Council
Contractor: Jason Cooper
Website: https://www.wollongong.nsw.gov.au

39 Coachwood Drive

 

The Challenge

A design project was raised to solve a localised flooding problem. The estimated value of the construction was quite low, so the available resources for design were fairly slim.

The hydrology and hydraulics associated with the flooding and the proposed solution were already complex, and once the flow modelling challenges were paired with site-specific building constraints, the team realised that to be confident in the performance of a design for this site, they would need complex and iterative hydraulic modelling.

Not having a budget to match the desired level of analysis, the team set out to find a way to make the analysis fit the available budget. The result far surpassed anything they had initially hoped for.
 

The Solution

The design project which spawned the subject process aimed to develop a low-cost surface swale in place of the existing barrier kerb. Visual inspection of the site readily identified this as the premium option for addressing frequent shallow-depth flooding with approach flows arriving from multiple paths. The real challenge was how to demonstrate (quantitatively) that it would work.

The hydrologic and hydraulic aspects of this design were a little more complex than those usually associated with the design of a grassed swale.

Approach flows arrived with relatively high energy from several different directions, with corresponding different response times and approach velocities in each sub-catchment branch. Temporally staggered peaks of runoff are good for keeping the quantum of peak outflow down, but bad for using simplistic hydraulics to figure out what is happening in a pond at the point of confluence, especially when the pond occurs over fairly flat terrain.

Proposed changes in design terrain levels were comparatively slight (as little as 50mm) because of depth constraints posed by buried utilities and restrictions on ground level gradients, the design area being in regular use as a waiting area for a school bus stop.

Peak flow velocities in the design swale needed to be fairly tightly controlled because the most desirable surface treatment was turf, and because there was an identified potential for flash flood flows in the swale to sweep small children into the creek.

The area is maintained by local residents, so the bank gradients where the swale discharges to the creek needed to be gentle enough for maintenance by a push mower. Flow energy at the confluence of creek and swale flows also needed to be monitored to avoid scouring velocities or turbulence over inadequately protected banks.

2D flow modelling of catchment runoff and flows over a design surface was identified as the most efficient and reliable means of demonstrating that the constructed works would solve the problem within the constraints (1D modelling of swale flows with staged peak inputs from each catchment branch would involve a considerable amount of guesswork, especially with regard to coincidence of peaks and velocity where outflows are turned to run down the creek bank).

The identified need for rigorous modelling aside, as with a lot of small value capital projects in local government, the scale of the construction dictates the scale of investigation and design, and this was always going to be a fairly low value construction.

Being a low-budget job with complex hydrology and hydraulics put this design into the ‘take a good guess, build it, and hope for the best’ category. The ‘relatively fine’ changes in surface flow regimes and the constraints particular to the site pushed the job up another notch, to the point where the guess becomes a bit too fuzzy for comfort, which made it a good pilot case for developing a way to apply complex and rigorous hydraulic modelling cheaply.

Moreover, this design presented a particularly suitable test case because with the finished swale being a fine balance between conveyance of design flows, permissible velocities, and constraints on excavation depth and swale gradients, development of a suitable finished surface model was almost certain to take several iterations of 2D modelling and terrain editing.

In an effort to find ways to increase design confidence on low-budget, hydraulically complex design jobs, a process was developed to streamline the collation of catchment terrain data with edited design surfaces and the creation of output DTM files compatible with the ANUGA 2D hydraulic modelling software (see http://sourceforge.net/projects/anuga/). The process needed to meet the following objectives (an illustrative model setup is sketched after the list):

  • Be really efficient (in order to afford to apply it on small jobs)

  • Be easy to implement, repeatable, and adaptable to other projects

  • Allow for simple, direct comparison of existing and design hydraulic models

  • Provide higher DTM resolution in areas where hydraulic analysis is touchy

  • Provide lower DTM resolution over gross catchment to reduce processing load
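
As a rough illustration of how these objectives might map onto a model setup, the sketch below (Python, using the general ANUGA API) builds a 2D domain with large triangles over the gross catchment and much smaller triangles over the design area, reads elevation from a points file exported from the collated terrain model, and runs a short simulation. The polygons, file names, roughness value and run times are placeholders invented for the example rather than values from the Coachwood Drive model, and depending on the ANUGA version some classes may need to be imported from their submodules.

```python
# Illustrative ANUGA setup only -- placeholder geometry and values,
# not the actual 39 Coachwood Drive model.
import anuga

# Hypothetical bounding polygon around the contributing catchment and a
# smaller polygon around the design swale (a real model would use survey
# coordinates).
catchment_polygon = [[0.0, 0.0], [800.0, 0.0], [800.0, 600.0], [0.0, 600.0]]
design_area_polygon = [[350.0, 250.0], [450.0, 250.0],
                       [450.0, 320.0], [350.0, 320.0]]

# Build the mesh: large triangles over the gross catchment (lower DTM
# resolution, less processing load) and small triangles over the design
# area (higher resolution where the hydraulic analysis is touchy).
domain = anuga.create_domain_from_regions(
    catchment_polygon,
    boundary_tags={'exterior': [0, 1, 2, 3]},
    maximum_triangle_area=200.0,                    # coarse over the catchment
    interior_regions=[(design_area_polygon, 2.0)],  # fine around the swale
    mesh_filename='coachwood_sketch.msh')           # hypothetical file name

# Elevation from the collated/filtered terrain exported as an ANUGA .pts
# file, a single assumed roughness for turf, and a dry starting surface.
domain.set_quantity('elevation', filename='catchment_terrain.pts')  # placeholder
domain.set_quantity('friction', 0.035)
domain.set_quantity('stage', expression='elevation')

# Keep the boundary simple for the sketch: reflective all round.
domain.set_boundary({'exterior': anuga.Reflective_boundary(domain)})

# Evolve for an hour of simulated time, reporting every minute.
for t in domain.evolve(yieldstep=60, finaltime=3600):
    print(domain.timestepping_statistics())
```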


The process evolved over the course of this project. Much of the information below focuses on a single key aspect of Council’s process, which touches on the subject of ALS normalisation.

The core theme of Council’s process is the use of a single software platform as a central hub where data is collated, edited, formatted and otherwise massaged to achieve an easy exchange of information between spatial models and hydraulic models, and to maximise efficiency of both hydraulic analysis and terrain editing (of design surfaces) without sacrificing data integrity or parity between models.

To anyone who has done some work in numeric modelling on datasets drawn from several sources, some of which are (or will be) manipulated on the basis of modelling output, the value of a single point of collation, editing, and formatting is difficult to overstate. In the subject process, further value is added by taking advantage of the powerful data formatting and visualisation tools in 12d Model software. In essence, the benefits of the process go beyond making the exchange of data faster, easier and safer and extend to actually improving the performance of the hydraulic modelling platform and improving the flow of the design and analysis cycle.

A significant deterrent to using 2D modelling for design purposes (especially for small scale designs) has always been the amount of time required for a model run—not only the fiscal cost associated with that time, but the more subjective hassle of losing mental momentum.

Designers like an analysis that happens while you check your email or grab a cup of tea: come back a couple of minutes later and voila! Designers don’t like models that say ‘set me up, hope you haven’t forgotten anything important and come back tomorrow to see how it went’. By the next day, those great ideas can seem fuzzier and less flexible, and the way forward around any setbacks the model has highlighted is much less clear. A key objective for Council’s process on this project was to reduce the hydraulic model runtime, not so the computers could get more rest while everyone was out of the office, but so the whole design process could accommodate complex modelling and stay dynamic, streamlined, and idea-driven.

When doing terrain-based 2D flood modelling, there are a few ways to get model runtime down. For a purely comparative analysis, it’s easy: apply heavy-handed filtering. Fewer points means fewer cells, fewer cells mean fewer operations, and fewer operations mean less runtime. Sometimes, however, deterministic modelling is required (i.e. the result needs to approximate a real-world situation), for instance when it is necessary to demonstrate that a shallow swale will work with estimated approach flows rather than merely demonstrating that it is an improvement over existing conditions (in the lab environment).

Reducing the processing load for deterministic modelling requires a little more ‘finesse’ because it is necessary to reduce the number of operations, but keep those which are most representative. There are also a few ways to do this. Council’s process is geared toward medium-sized datasets and focuses on reducing the processing load in areas where they need a quantitative estimate of runoff properties rather than a good representation of flow behaviour.

The part of the process that lubricates the hydraulic model aims to cull point density in the sub-catchments while retaining a representative terrain surface for modelling the catchment’s response to rainfall. In areas where existing and design features need more rigorous analysis, the point density is kept relatively high, so the hydraulic model’s capacity for detail is still used, but only where it is needed.

Another approach they have tried for addressing 2D model runtime is to run a separate model to generate catchment flows and then introduce the inflows (derived from the catchment model run) into a smaller area model around the design site. On this job, they applied that approach partially: mainstream creek flows were derived from a separate flood study and introduced at the boundary of the 2D model. For the sub-catchment areas contributing to the design site they needed more confidence in the timing and properties of runoff, so those areas were included in the 2D model with hydrology based on a rainfall input.
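
One way to express that split is sketched below: the mainstream creek hydrograph from the separate flood study comes in across a short inlet line near the model boundary, while rainfall is applied directly over the contributing sub-catchment polygon so its response timing falls out of the 2D model. The hydrograph shape, intensities, coordinates and polygon are all invented for illustration, the operator names follow the general ANUGA API rather than Council’s actual scripts, and the rainfall unit conversion is an assumption to be checked against the ANUGA version in use.

```python
# Illustrative only: creek inflow at the model edge plus rainfall over a
# sub-catchment. 'domain' is assumed to be built as in the earlier sketch.
import anuga

def creek_hydrograph(t):
    """Made-up triangular hydrograph (m^3/s) standing in for the flood study flows."""
    peak, time_to_peak, duration = 12.0, 1800.0, 5400.0   # placeholder values
    if t <= time_to_peak:
        return peak * t / time_to_peak
    if t <= duration:
        return peak * (duration - t) / (duration - time_to_peak)
    return 0.0

# Inlet line just inside the upstream model boundary (hypothetical coordinates).
creek_inlet_line = [[5.0, 300.0], [5.0, 330.0]]
anuga.Inlet_operator(domain, creek_inlet_line, Q=creek_hydrograph)

# Rainfall applied over the sub-catchment draining to the design site, so the
# timing and properties of its runoff come out of the 2D model itself.
subcatchment_polygon = [[100.0, 50.0], [400.0, 50.0],
                        [400.0, 250.0], [100.0, 250.0]]
rain_mm_per_hr = 60.0                                       # placeholder burst
anuga.Rate_operator(domain,
                    rate=rain_mm_per_hr / 1000.0 / 3600.0,  # assumed m/s of depth
                    polygon=subcatchment_polygon)
```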

Where detailed appraisal of inflows is crucial to design confidence and design runs are likely to be iterative, modelling the critical parts of the catchment, with terrain resolution tailored to reduce runtime, is in most cases easier and safer than collating static inflow data for each model run. All the judgment calls, data massaging, and integration of data streams happen up front, and, if this is done in 12d Model, you can ‘see’ what you are doing, which helps prevent mistakes. Once the input data is sorted to a more manageable level, the only data management task for each model run is to make sure the massaged catchment terrain is combined with the right edits data. From the existing conditions run to any number of design checks it is ‘apples vs apples’, so the risk of blunders does not grow with the number of model iterations.

The ALS model used to define sub-catchment terrain for the 39 Coachwood Drive project contained 1,598,284 points after it had been fenced to exclude any points outside the watershed and filtered to a z tolerance approximately double the design z accuracy of the source data.

A TIN was made from this cloud of points then the TIN was used to make smoothed contours at 2m intervals (go 12d Model!). The contour model had 367,571 points (23% of ALS). Another TIN was then made from the contours which were made from the TIN which was made from the ALS. A grid DTM with 2m point spacing was created from the TIN, which was made from the contours which were made from the TIN made from the ALS. The grid DTM had 195,750 points (12% of ALS).
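
Those steps were carried out with 12d Model’s TIN and contouring tools. For readers without 12d Model, a very rough Python/SciPy analogue of the ‘TIN the points, then sample a 2m grid’ stage is sketched below; it skips the smoothed-contour intermediate step, and the input file name and grid extents are placeholders rather than the project data.

```python
# Rough stand-in for 'TIN the filtered ALS, then sample a 2m grid'.
# The 12d Model contour-smoothing step in between is not reproduced here.
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Filtered ALS points as x, y, z columns (placeholder file name).
xyz = np.loadtxt('filtered_als.xyz')
x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]

# Linear interpolation over a Delaunay triangulation of the points,
# i.e. sampling a TIN surface.
tin = LinearNDInterpolator(np.column_stack([x, y]), z)

# Regular grid at 2m spacing across the data extent.
gx, gy = np.meshgrid(np.arange(x.min(), x.max(), 2.0),
                     np.arange(y.min(), y.max(), 2.0))
gz = tin(gx, gy)

# Keep only grid nodes that fall inside the TIN (NaN outside the hull).
grid = np.column_stack([gx.ravel(), gy.ravel(), gz.ravel()])
grid = grid[~np.isnan(grid[:, 2])]
print(f'{len(grid)} grid points from {len(xyz)} input points')
```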

Setting aside the difference in point volumes between existing and design features in the area of interest (not much in the scope of the whole model), the rest of the hydraulic model was now running almost ten times faster. Bearing in mind that the contour interval was set at 2m, the question becomes ‘do we really need curves that tight in our hillsides?’ Unless the catchment is very steep, with tightly defined watersheds between sub-catchments, the answer is that tight vertical curves are still needed...they have a serious impact on catchment response. The horizontal curves, however, can be somewhat more relaxed in most catchments. Applying this notion, the team filtered the penultimate dataset (the 2m grid DTM made from contours) to exclude points within 6m of each other in the horizontal and within 1m of each other in the vertical. This was aimed at shedding another 50-65% of points whilst retaining the representative slope and watershed boundaries of the catchment. The final dataset, a filtered grid DTM made from a TIN made from contours made from ALS, had 68,463 points (4% of ALS).
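
The 6m horizontal / 1m vertical pass was done with 12d Model’s filtering tools; as a rough stand-alone illustration of the same idea, the greedy thin-out below keeps a point and drops any other point that sits within both tolerances of it. The tolerances come from the text above, while the file name and the KD-tree approach are assumptions, and a greedy filter of this kind is order-dependent, so it approximates rather than reproduces the 12d Model result.

```python
# Greedy thin-out: keep a point, then drop every other point lying within
# 6m of it horizontally AND 1m of it vertically (tolerances from the text).
import numpy as np
from scipy.spatial import cKDTree

def thin_points(points, h_tol=6.0, z_tol=1.0):
    """points: (n, 3) array of x, y, z. Returns the retained subset."""
    tree = cKDTree(points[:, :2])               # horizontal neighbours only
    removed = np.zeros(len(points), dtype=bool)
    for i, p in enumerate(points):
        if removed[i]:
            continue                             # already absorbed by a kept point
        for j in tree.query_ball_point(p[:2], r=h_tol):
            if j != i and abs(points[j, 2] - p[2]) <= z_tol:
                removed[j] = True                # too close in plan and height
    return points[~removed]

grid_dtm = np.loadtxt('grid_dtm_2m.xyz')         # placeholder file name
thinned = thin_points(grid_dtm)
print(f'{len(thinned)} of {len(grid_dtm)} points retained')
```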

It is possible to conduct a similar operation numerically, but for Council the beauty of doing it with 12d Model is the ability to ‘see’ what you are doing. They were able to produce comparison shots of the TINs developed at each stage of the process. Any ‘lumps and bumps’ were progressively smoothed out, with some averaging below the source line and some above, to produce a normalised surface running right through the middle of the source data surface but with far fewer cells waiting to suck up the hydraulic calculations. By checking the progressive TINs in plan or flow direction view in 12d Model, they were able to observe that the cells which get culled are groups of smaller pointy triangles with similar aspects; those were the cells they wanted to aggregate by averaging.

At this point, the team was very pleased that their 2D model took 15 mins instead of 6 hours...however, there were concerns that they had ‘messed with the source data’. They had made a smoother catchment, which would increase the catchment response, possibly resulting in a more conservative estimate of peak flow conditions at the outlet. To decide whether that was acceptable, they needed to consider the effect that ALS normalisation has in light of typical hydrological practice.

If the design model were composed of several sub-catchments modelled with simplistic hydrology, then for each catchment they would be taking an average line up the middle and applying a roughness to it. The applied roughness value is related to the land use and is assumed to take account of surface features consistent with that land use. For the 2D model they also apply a roughness related to the land use. In both cases, the logic underpinning selection of roughness values assumes a flat or reasonably smooth gradient describing the average shape of the catchment. Applying this roughness to an already lumpy terrain surface is effectively doubling up on the roughness to some extent. They also needed to keep in mind that the lumpy terrain came from ALS, so each data point had a varying level of accuracy to begin with.

 

The Result

By normalising the ALS data they were not only reducing the model runtime, they were producing a catchment surface much closer to the theoretical ideal of the average catchment surface to which the selected roughness factor was applied. There will always be exceptions, but for most cases in most catchments, this technique will make for a more realistic simulacrum of real world conditions and for a better level of concurrence with other modelling methods.

Having initially set out to make rigorous hydraulic analysis a little more achievable on small-scale designs, by the end of the pilot project the team had arrived at a reserved satisfaction with the result. As well as making complex hydraulic modelling more attainable for smaller projects, the process also improves parity between the surface models in 12d Model (from which construction data is taken) and the surface models used for hydraulic analysis.

The ease and speed of the process helps designers avoid the trap of making a little ‘manual’ edit here and there (to meet physical or hydraulic constraints) and ending up with construction drawings that vary substantially from the hydraulic model on which the cost/benefit case is hung.
