TLU/TK in a Digital Time

Posted on April 1, 2015, 7:05 p.m. by Bram Wiebe with tags: TK, Consultation, GIS

Indigenous people are deeply concerned about keeping their Traditional Land Use and Traditional Knowledge information secure. Avoiding digital formats altogether is not practical, so the question facing indigenous people is how to minimize the risks of holding potentially sensitive information digitally. The natural response is to keep this information close, but that impulse may not have the intended effect.

The body of thought behind data security, privacy and integrity is vast, so this short post is intended to highlight some of the issues rather than serve as an exhaustive guide. I'll focus on three areas:

  1. Data Security
  2. Data Privacy
  3. Data Integrity

Data security is primarily concerned with keeping your information out of the hands of people who should not have access, but it also includes preventing its loss or destruction. The ways to address those concerns include:

  1. Physical Security - Putting the computer with the sensitive information in a physically secured and monitored space.
  2. Digital Security - Putting the computer in a managed environment where software is kept up-to-date so the computers and the information on them can only be accessed via secure protocols.
  3. Backups - Making copies daily both locally and off-site in the event of a fire or natural disaster.

Well-managed cloud-based services perform well in the area of data security because they have access to professional hosting staff and equipment to address these issues. Small organizations of all types have a much harder time meeting these stringent requirements. In smaller organizations, computers with sensitive information often sit in someone's office where many people can access them. Those same computers are also used for web browsing, email and other activities, which opens them up to many forms of attack. Lastly, many small organizations don't have formal data backup procedures, including off-site backups, so a natural disaster could wipe out their information.

Data privacy, in the context of having effective data security, is primarily concerned with ensuring that valid users' access to private information is limited by need. In evaluating options in the area of privacy, some questions to consider include:

  1. Does the system control access to information based on its privacy?
  2. Does the system generate warnings about reports that could be distributed by insecure means if they include private data?
  3. Does the system provide the means to hide personal identifying information?

In this area well-managed cloud services do quite well, but the weak privacy controls in some social networking tools have raised doubts in the minds of many users. On this issue it is important to understand who the service provider is and what steps have been taken to protect privacy. For example, LOUIS Heritage requires that every mapped feature or non-spatial discussion be marked with a privacy setting, which is then used to filter search results based on user permissions. Similarly, LOUIS Heritage limits the inclusion of information about participants in reports to protect privacy. In contrast, desktop GIS systems managed on a file server tend to give either no access or complete access, which is too crude to properly address privacy concerns.
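
As a rough illustration of that filtering pattern, a privacy-aware search can be sketched in SQL. This is a hypothetical schema for illustration only, not the actual LOUIS Heritage implementation:

-- Hypothetical schema for illustration; not the actual LOUIS
-- Heritage implementation. Each feature carries a privacy level,
-- and a search returns only rows the user is cleared to see.
SELECT f.id, f.title, f.the_geom
FROM feature f
JOIN app_user u ON u.id = :current_user_id  -- the user running the search
WHERE f.privacy_level <= u.clearance_level;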

Data integrity is primarily concerned with information accuracy. When evaluating options from this perspective, questions to consider include:

  1. Is it possible to specify a particular format for information?
  2. Does the information require review before it is used?
  3. How does the system prevent accidental deletions or edits?
  4. Does the system ensure that relationships between different pieces of information are maintained?

No desktop GIS solution has these capabilities out of the box. Version management tools and tools for enforcing consistent formats exist for GIS environments, but using them can be technically challenging and usually requires significant effort. Cloud-based services perform variably on this front. Generic archive services do not address this issue at all because they simply store files without any knowledge of their contents. LOUIS Heritage, however, requires that data meet certain format requirements, and data moves through a number of states from when it is first gathered to when it is reviewed and finalized. The use of states prevents finalized data from being deleted or edited accidentally.
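
As a rough sketch of the state idea (again a hypothetical schema, not LOUIS Heritage's actual implementation), a database can record each item's state and refuse changes once it is finalized:

-- Hypothetical schema for illustration only. Records move through
-- states, and a trigger blocks changes to finalized records.
CREATE TABLE feature (
    id    serial PRIMARY KEY,
    state text NOT NULL DEFAULT 'draft'
          CHECK (state IN ('draft', 'reviewed', 'finalized'))
);

CREATE FUNCTION protect_finalized() RETURNS trigger AS $$
BEGIN
    IF OLD.state = 'finalized' THEN
        RAISE EXCEPTION 'finalized records cannot be edited or deleted';
    END IF;
    IF TG_OP = 'DELETE' THEN
        RETURN OLD;
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER feature_protect
    BEFORE UPDATE OR DELETE ON feature
    FOR EACH ROW EXECUTE PROCEDURE protect_finalized();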

It is possible for indigenous communities to put all the measures discussed here in place, but in most cases it is financially and functionally impractical to do so. Cloud-based services, like everything else in life, are not risk free, but compared to files on someone's computer in an insecure office, the choice is clear. To contact us about better securing your local information, or to learn more about the LOUIS tools, use the contact page here.

Why Service Based Solutions are Often a Good Idea

Posted on Sept. 19, 2014, 7:09 p.m. by Bram Wiebe with tags: Web Solutions

I have worked in the software and GIS industry for almost 25 years, and just like fashion, things run in cycles; the technology changes, but the cycles themselves stay mostly the same. The classic cycle is the one between centralized and localized hardware and software. In the old days, when computers were the size of buildings, the concept of the laptop or the smart phone was limited to science fiction. Eventually computers got smaller and soon everyone had their own personal computer, but with the advent of the internet, having everything connected again became popular and useful. The current trend is toward Cloud computing, which for most ordinary folks means using centralized services. Many big and technically capable organizations like cities, school boards and medium to large companies are moving away from running their own mail servers and keeping multiple copies of office software up-to-date, switching instead to offerings from Google and others. Service based solutions are becoming increasingly popular because they save money, provide more flexibility for users and reduce the burden on an organization's technical team.

At the same time, many small organizations seem to want to have all their stuff on their own little server, and in many cases this is not a good choice. The most obvious case I've seen of this is smaller companies, which are not even IT companies, deciding they need to develop their own special webGIS tool. The situation is very similar to one I saw about 20 years ago, and I want to tell two stories to explain my concerns with what is happening today.

In the early 1990s, when my career as a software application developer was just beginning, I was writing custom DOS based data management applications for clients using Clipper and an add-on called Zachary. Together these tools let me create reliable and effective basic data management applications very quickly. At that time there were not many commercial options to choose from, and commercial software was quite expensive. Because I could create reliable applications quickly and inexpensively, custom applications made functional and economic sense for my clients.

When Microsoft Windows started to dominate the personal computing world, Microsoft released the Access database product, which made it pretty easy for anyone to create their own database systems. Many horrible products were subsequently created by summer students and others. These poorly conceived applications would eventually break, requiring them to be updated or rewritten. In many cases, especially as more commercial tools became available, organizations would eventually buy a commercial tool and live with its limitations because they wanted something reliable.

These stories highlight four points which I'll expand on below:

  1. Custom software often doesn't save time or money

  2. In-house development needs long-term dedicated staff

  3. Just because something is easy to use doesn't mean it is the right tool for the job

  4. Maintenance, data management and data protection over the long term have to be part of your cost / benefit analysis

Custom software doesn't usually save time or money, for many reasons, but I'll highlight three. First, software development is expensive, and robust software development is more expensive still. A first- or second-year computer science summer student rarely has the skill, expertise or confidence to ask the important questions about business needs, and if the student doesn't understand the problem, they can't create the right solution. The second reason relates to maintenance: if the software was written by an inexperienced programmer who had limited time, the documentation will be minimal, which makes maintaining or updating it more difficult. Third, software written for a single client doesn't get updated unless the client is willing to pay the entire cost of the update, whereas for commercial applications or services the cost of development and support is covered by the entire user community.

In-house development needs long-term dedicated staff to be sustainable. If an application is created by a single developer who has not been given the mandate and time to create good and complete documentation, that documentation is unlikely to get written. An organization dependent on a piece of software written by one person is exposed to significant risk if that person ever leaves or gets sick. If an organization has unique problems that commercial software or services cannot solve, and it has the resources to create an in-house development team, then in-house development might work. That in-house team, however, must get the time and mandate to create good documentation and do robust testing for the effort to succeed. For a small organization with a single technical staff person, in-house development for anything but simple problems is unwise.

Just because something is easy to use doesn't make it the right tool for the job. When I'm doing renovation work on my house, I don't try to figure out how to attach boards with duct tape just because it is easy or I happen to have a roll in my hand; I go and get some screws or nails. Software development is exactly the same: just because someone knows tool X doesn't mean it is the right tool for the job. Microsoft Access was a very powerful tool, useful for many things, but it wasn't the right tool for every problem; yet when it was the cool thing to use, people tried to apply Access to all sorts of things it wasn't intended for.

Maintenance, data management and data protection over the long term have to be part of your cost / benefit analysis. This is especially true when comparing custom software with Cloud services. For example, imagine two options with similar functionality but the following costs and features:

  • Option 1 – $10,000, source code free, no backups, no updates, no support after 30 days

  • Option 2 – $2,000 / year, backups, updates, support

In this case, the non-technical person will often see the following:

  • Option 1 – $10,000 and it is mine

  • Option 2 – I have to pay forever

The non-technical person doesn't usually consider the time and costs associated with backups, updates, security and support. In this example, with Option 1 the client would pay more and get less. In addition to the ongoing costs, which may add up to the initial purchase price, the client will need to replace the server, and perhaps the entire system, in five years. With Option 2 the costs are clearly defined and predictable. Predictable pricing, reduced responsibility and fewer staff are some of the reasons why many big organizations, and some smaller organizations with strong technical leadership, are turning more and more to Cloud service based solutions.
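
To make that concrete with round, assumed numbers: if Option 1 also requires roughly $1,500 a year for ad hoc maintenance, backups and security, plus a $5,000 hardware replacement at year five, its five-year total is about $10,000 + (5 × $1,500) + $5,000 = $22,500. Option 2 over the same five years is simply 5 × $2,000 = $10,000.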

Organizations thinking of building their own webGIS solution need to ask whether they really need a custom solution or whether there is a service out there that will do what they need.

Communities thinking about buying a custom software solution, especially in the webGIS space, need to think about the total cost over five years. To help with this, they should at least ask the following:

  1. How is the system going to get backed up and who is going to do it? Where are off-site backup copies going to be kept in case the building burns down or the machine is stolen? What are the annual costs of dealing with these issues correctly?

  2. How is the server going to be physically protected from theft or tampering?

  3. Who is going to fix the system when it breaks and how much is that likely to cost every year?

  4. Who is going to keep the server software secure and the application software up-to-date and how much will that cost each year?

  5. How many staff are needed, and what are the training and retraining costs likely to be when staff change jobs?

  6. What will it cost to replace the entire system in 5 years?

Organizations with a solid IT team will likely ask these questions before approving any purchase, and even if they don't ask them up front, they will be able to give good answers to them. If you are thinking about buying a system and you don't have someone in your organization asking these questions or able to answer them, then a custom solution is probably a bad idea and you should be looking for a service based option.

There are cases where custom software solutions are needed, and there are some reputable firms that specialize in this work. Small organizations, however, need to understand that almost without exception, custom solutions will cost more and be more work to maintain.

I strongly encourage users to think carefully about long-term costs when selecting solutions so that they get the best value from their IT budgets.

Managing Attributes in a PostGIS Dissolve

Posted on April 30, 2014, 8:42 p.m. by Bram Wiebe with tags: Programming, GIS

For GIS programmers, PostGIS is a powerful spatial database, but the way that SQL works can cause some confusion.

When a layer is buffered and dissolved, it isn't always clear in PostGIS how to keep the attributes attached to the features they came from.

In the figure below we have six polygons, which we will assume have already been buffered. We can see that these polygons overlap in two groups:

Source Polygons

To do a spatial dissolve, the ST_Union function is used as follows:

-- Merge all geometries into one, then split the result back into
-- its single-part polygons.
SELECT
    (ST_Dump(ST_Multi(ST_Union(the_geom)))).geom AS the_geom
FROM
    source_layer;

This works, but any connection to the feature attributes is lost. Trying to manage attributes inside this original query is very complex and leads to unreadable, and thus unmaintainable, queries.

The easiest way to keep the connections between the dissolved areas and the original feature attributes is to add the connections AFTER the dissolve using ST_Intersects.

In this case the key field is 'id', so we will keep the lowest id for each spatial group and also an array of the ids that we can use for other joins later (a join example appears at the end of this post). The SQL is as follows:

SELECT
    min(a.id) AS id,         -- lowest source id in each group
    array_agg(a.id) AS ids,  -- all source ids, kept for later joins
    b.the_geom
FROM
    source_layer a,
    -- b holds the dissolved (unioned and re-dumped) polygons
    (SELECT
        (ST_Dump(ST_Multi(ST_Union(the_geom)))).geom AS the_geom
    FROM source_layer) b
WHERE
    -- attach each source polygon to the dissolved group it falls in
    ST_Intersects(a.the_geom, b.the_geom)
GROUP BY
    b.the_geom;

The resulting table looks like this:

id | ids     | the_geom
---+---------+----------
 4 | {6,5,4} | 0103...
 1 | {3,2,1} | 0103...

The resulting layer looks as follows:

Dissolve Result
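
The ids array makes it easy to join other information back to each dissolved group. Here is a minimal sketch, assuming the results of the query above have been saved to a table or view named dissolved (the name is illustrative):

-- Join each dissolved group back to the full attribute rows of its
-- source polygons ("dissolved" is a hypothetical table or view
-- holding the results of the query above).
SELECT
    d.id AS group_id,
    s.*
FROM
    dissolved d
    JOIN source_layer s ON s.id = ANY(d.ids);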

The next time you need to dissolve features and manage their attributes in PostGIS, hopefully this approach will be useful to you.

How Might TLU & TK Fit Into Economic Development?

Posted on Nov. 23, 2013, 8:11 p.m. by Bram Wiebe with tags: GIS, Traditional Knowledge, Web Solutions, Spatial Planning, Aboriginal Relations, Field Methods

I had the privilege of attending the Alberta First Nations Lands & Economic Development Training Symposium in Edmonton in November 2013, put on by CANDO (www.edo.ca). At the conference dinner, former National Chief of the Assembly of First Nations Ovide Mercredi gave a keynote address. In his speech, Chief Mercredi spoke about how important it is for Canada's Aboriginal people to take initiative and be self-reliant, especially in the context of managing their land. The MC for the evening related this idea to his childhood experience of being taught by his father to go out and meet the tide instead of waiting for it to come to them. I liked that image and began to think about how TLU & TK information can help Canadian Aboriginal groups meet the tide effectively.


Often when people talk about Traditional Land Use or Traditional Knowledge (TLU / TK), it is in the context of responding to an industry proposal. These reactive uses of TLU & TK ensure that the government or proponent sets the terms of reference, which in turn leaves many Aboriginal people feeling that their knowledge was, and will be, ignored.


It is clear that communities need ready access to quality, up-to-date TLU & TK information for dealing with day-to-day land management issues. With the advent of new mobile technologies, it is now very easy for communities to start capturing use information through a relatively low-cost ongoing program rather than relying exclusively on expensive, periodic research projects that often just result in paper reports that sit on the shelf and collect dust. We and others provide mobile tools and data integration services for TLU & TK, but in the context of economic development more is needed.


Even with accessible, quality, up-to-date TLU & TK information, communities need tools to manage those day-to-day interactions with industry and government land referrals so that their limited staff are not overwhelmed. A variety of land referrals tools exist, and we will be releasing a new tool in 2014 that will be very flexible, easy and affordable for communities of all sizes. Again, however, from an economic development perspective I'm not convinced that land referrals tools together with community research tools are enough.

Outside my work with Aboriginal communities, one of the tools I use is a landscape optimization tool called Marxan. Marxan allows multiple competing land uses to be evaluated together, and it provides a variety of solutions in which all targets are met while their impact on each other and the overall cost are minimized. Marxan has been used globally for more than 10 years, mostly for conservation, but there are a few examples where it has been used to create solutions that meet both conservation and development targets. I have had a few opportunities to include TLU data alongside scientific and economic data in Marxan projects, and it worked well.

From my perspective, putting tools in place to gather TLU & TK data on an ongoing basis is important. Equally important is the ability to manage the day-to-day engagement process effectively. With processing fees for land referrals and the right tools, I believe those can be mostly self-sustaining activities. Where I see a need is in proactive planning. I am confident that proactive engagement and spatial planning would improve the efficacy of TLU & TK inclusion, and that spatial planning is an essential component of a community economic development plan.

Gathering Quality TK Part 3 - Using Mobile Applications

Posted on Oct. 2, 2013, 7:13 p.m. by Bram Wiebe with tags: Web Solutions, Spatial Planning, Field Methods, GIS, Traditional Knowledge

The smart phone has brought many new technologies to ordinary users, including GPS. I think many GIS technicians do not see the smart phone as a GPS data collection tool, but it can be very useful for collecting GPS data IF its limitations are understood and it is used appropriately. In this post I'll share some ideas on how this technology can allow for broader and more cost-effective gathering of high quality TK data.

Standard smart phone GPS works fine in areas with good cell tower coverage, but outside the coverage area it often doesn't work well. This is where Bluetooth GPS receivers come into the picture. The concept is simple: these devices, like the Dual XGPS 150A which I've been testing lately, provide a real GPS antenna while the phone provides the user interface. I've found that my old iPhone 3GS transforms into something useful with this $100 CDN add-on. For example, when navigating while driving, the phone is now fast enough that Google Maps can actually notify me to turn before I reach the corner. In field data collection tests I've done in valleys and areas with tree cover, where assisted GPS is almost useless, my accuracy with this GPS receiver has consistently been +/- 10 metres or better. I've verified this accuracy by gathering data at known landmarks and checking my results on appropriately scaled maps.

With a smart phone and a $100 CDN GPS receiver you have a basic platform for collecting field data in cases where +/- 10 metres is good enough. There are cases where higher accuracy is needed, such as land surveying, road building and so on, but there are many places where +/- 10 metres is just fine. Consider that most Use and Occupancy Mapping (UOM) studies use 1:250,000 or 1:50,000 maps. When mapping at those scales with a fine tipped pen on a paper map, the best consistently achievable resolution will be 1 to 2 mm on the map, which translates into 250 to 500 metres on the ground at 1:250,000, or 50 to 100 metres at 1:50,000 (at 1:50,000, 1 mm on the map covers 50,000 mm, or 50 metres). A point gathered in the field with +/- 10 metre accuracy is a big improvement over a paper map point because it is both more accurate and physically verified.

Some people may object that if a community is going to invest in field visits, then spending another $2,000 to $5,000 per team on high-end GPS technology is an appropriate expenditure. This can certainly be the case, but the argument generally presumes that field visits will be done by trained GPS operators going out with land users. There are, however, many cases where you want to gather data from regular users on a day-to-day basis.

Before I give examples of uses for GPS data gathered by regular users, I should mention the other main benefit of smart phones in data gathering. The field survey tools that come with high-end GPS devices have, in the past, not been particularly user friendly. With the advent of mobile applications for smart phones, users can choose from close to 100 different data collection apps on the major mobile platforms, so writing your own mobile app isn't likely to be necessary. These options change all the time, but the point is that these tools let project managers create simple, easy-to-use forms that users can fill out on their smart phones and send in without visiting your office. This is a big advantage over having to physically retrieve the GPS unit when people come back from the field.

Even with a relatively easy-to-use interface on their phones, participants will still need to be comfortable with a smart phone and willing to spend half a day learning how to use the GPS receiver. The program costs could be minimal because the payment could be the right to use the GPS receiver. Not only is this less expensive than a traditional map interview or formal field visits, but the data is potentially richer: it can be continuous and can provide the time of day, date and year of actual use as well as location and descriptive information.

Possible programs could ask users to record their activity and location when they are on the land, or could pair young people with Elders. The second type of program would not only provide useful field data but also create educational opportunities for young people, both in using GPS and in learning from their Elders. The relatively low cost of these tools also allows for many other types of data collection, such as tissue sampling in a monitoring study.

To the GPS technicians: this technology doesn't replace the value of high-end GPS, but it lets communities spend those dollars where they really count, such as on archaeological sites where accuracy really matters. Using community members in data gathering has the potential to increase community engagement, and the resulting data can in turn be used to identify areas that require high-end GPS mapping.

The smart phone with a GPS receiver is a powerful combination. As in many other areas, the involvement of ordinary people in data collection can be powerful if it is carefully planned up front so the data can be easily integrated and used. In our TK solution, LOUIS Heritage, we support the use of mobile applications for this purpose and allow site visit data, including photos, to be uploaded directly. I expect that other tool providers have done or will do this soon because of a growing desire in Indigenous communities to take charge of their own information.