Developments in FTTH/FTTx network design have steered towards a higher reliance on open-source data. While this data is attractive for saving time and cost, using it beyond the strategic planning level of network design is perhaps not (yet) advisable. Here’s why:
Network engineers’ capabilities have grown in recent years with the rise of open data: high-quality satellite imagery, street view, and openly available maps with city topology.
Engineers can now place georeferenced basemaps under their drawings in CAD or GIS, use street view as a ‘site survey’ tool to quickly check the surroundings of the drawing, and verify the location of streets and buildings with open maps, whose data can also be imported and used as entities in a GIS system.
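As a minimal sketch of that last step, the snippet below turns an open-map extract into simple entities a GIS or CAD tool could consume. It assumes an OpenStreetMap-style GeoJSON extract; the property names and coordinates are illustrative, not taken from any real dataset.

```python
import json

# Hypothetical OpenStreetMap extract in GeoJSON form; the building types,
# tags and coordinates below are made-up illustrations.
geojson = """
{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "properties": {"building": "residential", "addr:street": "Main Street"},
     "geometry": {"type": "Point", "coordinates": [5.1214, 52.0907]}},
    {"type": "Feature",
     "properties": {"building": "commercial"},
     "geometry": {"type": "Point", "coordinates": [5.1220, 52.0911]}}
  ]
}
"""

data = json.loads(geojson)

# Turn each feature into a simple entity: (building type, street, x/y).
# Note that attributes like the street are often simply missing.
entities = [
    (f["properties"].get("building", "unknown"),
     f["properties"].get("addr:street"),
     tuple(f["geometry"]["coordinates"]))
    for f in data["features"]
]

for kind, street, xy in entities:
    print(kind, street, xy)
```

Note how the second feature carries no address at all; as discussed below, such gaps are exactly what limits open data as real input data.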
Because open data is so readily available, there is a trend towards going beyond using it as a tool and instead using it as actual input data for the network. And because open data is usually standardised, automatic functions can be created that combine the input data with the network topology/ruleset to create an FTTx network on the fly.
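The kind of automatic function described above can be sketched as follows: homes from an open-data extract are assigned to the nearest cabinet under a simple distance rule. All names, positions and the distance limit are illustrative assumptions, not a real ruleset.

```python
import math

# Illustrative input as it might come from an open-data extract;
# the home/cabinet names and coordinates are made up.
homes = {"H1": (0.0, 0.0), "H2": (1.0, 0.5), "H3": (4.0, 4.0)}
cabinets = {"CAB-A": (0.5, 0.0), "CAB-B": (4.0, 3.5)}

MAX_DROP_LENGTH = 2.0  # assumed rule: maximum home-to-cabinet distance


def assign_homes(homes, cabinets, max_len):
    """Assign each home to its nearest cabinet within the ruleset limit."""
    plan = {}
    for home, pos in homes.items():
        name, dist = min(
            ((c, math.dist(pos, cpos)) for c, cpos in cabinets.items()),
            key=lambda t: t[1],
        )
        # Homes beyond the limit are flagged (None) for manual engineering.
        plan[home] = name if dist <= max_len else None
    return plan


plan = assign_homes(homes, cabinets, MAX_DROP_LENGTH)
print(plan)  # → {'H1': 'CAB-A', 'H2': 'CAB-A', 'H3': 'CAB-B'}
```

A straight-line rule like this is exactly what makes such output look plausible while ignoring ducts, obstacles and capacity, which is why it rarely survives contact with construction.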
Such automated networks are very quick and cheap to create and can at the outset give good-looking results, but they are unlikely to produce accurate material lists and do not result in designs suitable for actual network construction. In the end, the engineer will still have to draw manually, and much of the automatic work will have to be redone.
In fact, reliance on this data is not without drawbacks. For starters, not all countries have open data available, so in those countries you would not be able to work ‘business as usual’, which could lead to difficulties. And even in countries with data, vital information such as the number of Fibre Termination Units (FTUs) or house numbers is not present in the identifiers of home entities. This would leave planners and engineers in the dark unless it is combined with other data.
Another issue is that this kind of data has not been verified and is thus in many cases not up-to-date or accurate. This means that relying on open-source data will more often than not lead to an incorrect design, especially when on-site surveys are also left out of the engineering process (more on that later). To further complicate matters, open data is usually processed in GIS-based environments, which do not lend themselves very well to customising a design until it is construction ready (a low-level network design).
It is for these reasons that we suggest that reliance on open data should be limited to strategic/feasibility studies. Beyond basic high-level designs, the ‘old-school’ ways of FTTx network design are still important: accurate input data and verification in the field using site surveys are needed.
Accurate data may not always be available, but in most countries one can eventually gain access to cadastral data and data on existing infrastructure such as ducts, poles and pipelines. Such data is vital for a workable design, as ignoring existing infrastructure will inevitably lead to issues. These could be related to permits, to deviation requests from contractors that bring additional costs and planning problems, and even to accidents in the field with, for example, gas lines.
Even with accurate data there will always be uncertainties in the design, and site surveys are important to root these out. They are time consuming, but with modern technology (tablets and mobile CAD software) can be performed quickly.
ITSimplicity Solutions BV believes that at this moment open data should, like automation, be considered a useful tool and not an end in itself. It is useful for forming a very basic (strategic) idea of the target FTTx area, for helping the engineer during drawing, or for assisting in creating the basic network input data.