Demo session on LINZ Data Service APIs
[Ben Reilly, Relationship Partner - Sector Engagement]
Kia ora everybody. Welcome to this How-To session today, thanks for joining us, and welcome to any new participants in the Aotearoa Property Data Network. And to any repeat customers, welcome as well. So today, as you will have seen on the registration, we've got a How-To session about using APIs from the LINZ Data Service, and it's going to be presented by James O'Brien here from our Open Data and Reuse team.
And in terms of the running of the meeting today, can we just hold off questions until the end? There'll be time allowed for some Q&A at the back end of the session. Otherwise, I will run through a karakia and then we'll press on.
Whāia te mātauranga,
kia mārama, kia tupu, kia tiaki ngā whenua, ngā moana, ngā arawai
Kia whai take ngā mahi katoa Aroha atu aroha mai, tātou i a tātou
Toi te kupu, Toi te mana, Toitū te whenua. Haumi ē, hui ē, tāiki ē!
And that's just to say that we pursue knowledge for understanding, developing and caring for the lands, bodies of water and waterways, seek purpose in all that we do.
Let us show respect for each other. Hold fast to our language, hold fast to our spiritual strength, sustain the land, gather and go forward together. And with that, I will stop presenting and hand over to James.
Thanks, Ben. Hi, everyone. I'm James O'Brien. I'm on the Open Data and Reuse team. The most common thing we manage that people are aware of would be the LINZ Data Service, but we also have a lot of interaction internally with mapping, data analysis, that sort of thing. Normally with these kinds of demos we've got quite a lot more time, and they're usually face to face.
So we're going to do things a little bit differently today. Instead of going through the PowerPoint presentation and so on - there's just not enough time - we're going to rip straight into a pseudo-live demo of some of the more common things we get asked about the API services that are provided through the LINZ Data Service. So bear with me if we're covering some stuff that you might be super familiar with, and if any of it is a little bit beyond you, we're always available for questions through our inbox; we'll provide contact details for that at the end, and probably when we send out the meeting recording.
So yeah, my team is in the chat today, so I'll be leaning on them if we need any Q&A support. On that note, if you could just hold any questions to the end, we'll have about 10 minutes there to run through anything. We're also going to be skimming over the top of some of this stuff, more showing you some of the potential, rather than really diving deep into it; that needs a bit of time and effort that we won't get through in 25 minutes. So I'll share my screen and we'll get into it.
So first of all, we're working on the assumption that everyone's already familiar with the LINZ Data Service and the type of work that we do. But just briefly, it's a data portal with about 1,600 layers or so.
Okay, so if I go to Browse, it should tell me... okay, about 2,700 - I was short by a thousand - so really there are about 2,700 layers in there. A lot of it has to do with the property datasets from Landonline, our aerial imagery, elevation models, hydrographic and topographic data, and the Gazetteer, that sort of thing. So we'll be skimming over quite a few different layers today, just to get a bit of a broad idea of it.
But I thought we'd kick off first with a basic overview of what you get out of the box when you're looking at API use. So take, say, the Property Titles layer, a common one with very, very high use. As admins we can see behind the scenes that it's quite a high-use layer.
It's also a high-volume layer, in that there's a lot of content inside it. On the Services and APIs tab, you can see these four options here. Broadly, WFS, if you're not familiar with it, is effectively streaming data directly from our service to yours. It's quite common for applications like QGIS to use this kind of web service; it's a little less common for ArcGIS Online.
We'll get to that in a minute; with ArcGIS Online or ArcGIS Pro there are Esri services for that, which are quite a different thing.
WFS Changeset - if you haven't come across changesets before, they're a way to keep downloaded datasets up to date without having to download the entirety of a dataset. There are, I think, 2.3 million objects in this dataset.
It gets updated at least once a week, and you don't want to have to download 2.3 million features every week. So if you are managing data through a download service, changesets are absolutely for you; again, we'll get to those in a lot more detail later. Spatial Query is kind of a little buffer zone you can output, and CSW - these two here are common across the vector and the raster datasets - is Catalogue Services for the Web, so effectively metadata for the whole site.
We're going to more or less ignore those two today, but if you want to know more about them, again, you can come and talk to us afterwards. So we'll just quickly duck into the rasters as well, and then we'll come back here and take a look at how to use these sorts of services in practice.
You've also got this other window here, which shows a quick view of all the API keys that you currently have set up. For any of the work you do with APIs, you need an API key; that's your little token that lets you access the service, and you can manage them through here.
So all the ones that I have set up are going to be listed here, and it's easy to create them: you just hit Create. For 99% of the work you do, 'data access only' is absolutely appropriate; you just give it a name, click Add, and you get a key. And then it's gone - so I'm going to delete that one so you can't steal my API key. It hides them automatically, and for the manual ones you've got to save them when they first open up, because they'll never be revealed again once you've moved on. The basic ones, though, you can expand out. So, quickly into the API stuff: once you've set up an API key, it's going to automatically populate.
You don't have to go and add these at any point if you're taking them out of the UI; it'll just give you a list here of all the ones that you've got, and it'll insert them where it says 'API token'. So I'll jump over to a raster layer. If we're looking at the Wellington urban aerials, again the Spatial Query API and the Catalogue Services are always there.
But the standard offering - the alternative to WFS feature services - is Web Map Tile Services, WMTS. These are a kind of pre-rendered, cached way to distribute data. It's substantially faster in terms of immediate access than downloading the datasets: you just get small subsets of the data as you zoom in and out, and it won't return more than you need at any given time.
So it can be quite efficient with how the data is distributed. Every raster layer - the aerial imagery, the elevation models, the DEMs, the DSMs - has a WMTS feed attached. And if you have been around the block for a while, these superseded the Web Map Services, the WMS services, that we used to provide. So they're totally gone.
You won't find them on the LINZ Data Service anymore. As part of this, we've also got another application, released a little while ago, called Basemaps. This is kind of a stitched-together version of all the available imagery, giving national coverage in a single backdrop base map. It's very responsive as you zoom into areas, and the rendering is incredibly fast.
What it'll do - you'll notice the colours changing as you zoom in and out - is pick whatever the best available data is at that scale at that time and display it for you. So again, you can plug this into your system and away you go. This image screen here is kind of like a preview window.
The data is not downloadable in any way through the Basemaps website, but all the data that is on the Basemaps website you can get from the LINZ Data Service. So this Wellington dataset is feeding into the Basemaps service, and again I'll show how to hook it up and access it through ArcGIS Pro and QGIS in just a little bit.
So, very broadly, those are the out-of-the-box services that you get. We'll load them up and see what happens. There are people from councils here, so probably a mix of QGIS and ArcGIS Pro; I'm not sure what most people here are using, so I'll try and show both, starting with ArcGIS Pro.
So this is the Basemaps service loaded in, and connecting it up is pretty straightforward. You go to the Basemaps website, and down the right here you've got all the different options. I grabbed the WMTS NZTM2000Quad; just hit it and it'll generate an API key for you automatically. This expires every 90 days, so if you need something more permanent, you can contact us and we can give you a permanent API key.
But we keep track of the 90-day expiring ones, so there's no sort of system throttling happening. If we go back to here, under Connections and then New WMTS Server, you copy and paste that in, hit OK, and it shows up - I won't do it twice, it just shows up in your list.
So that's it; that's how simple it is to add Basemaps, and then you don't have to worry about keeping your imagery up to date. It's kept current: as we get new imagery coming in through the LINZ Data Service, it gets added into the service and plugged in. This can also be opened in ArcGIS Online, if you're using web maps, as an alternative to the many, many other basemaps that are provided on there.
In QGIS it's more or less the same thing, just a different way to get there: you go to add a WMS/WMTS layer. I've got one already loaded here, but you would go in, give it a name, hit Add, connect, and grab one of these different flavours, which exist for different reasons. This is a little bit beyond me, but I've been told this one's probably the more efficient of them. So you add it in - I've just added it twice, with a bit of transparency on it - and there it is. It loads in and again is very fast, very responsive. So we'd usually recommend Basemaps for any kind of backdrop imagery, rather than manually adding all the individual layers from the LINZ Data Service.
But there are a couple of drawbacks. Firstly, everything is there; you're not filtering this, it's just complete. There are ways to do that, but it starts to get a little bit under the hood - you're getting into the more programming side of things - and it's faster to go and get your own layers at that point.
But again, for 99% of use cases it doesn't matter if you've got the whole thing in there; the performance you're going to get won't make a big difference. So that's where Basemaps comes from. For the individual WMTS feeds coming from the LINZ Data Service - if I jump back there for a minute - getting one is exactly the same. Once you've got an API key in there, you can just copy the URL to the clipboard, jump back into the application - Layer, New, URL - and there you go. We'll grab the 2193 series, because that's what our project CRS is in.
If I turn that one off, you can see this loads straight away as well. And if you noticed the performance difference from before, the rendering this first time around takes longer; that's just because this service isn't specifically designed for that kind of rendering the way Basemaps is, but it is still very, very quick.
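For reference, the WMTS capabilities URL being copied here follows a predictable pattern, so it can be built in code. A sketch in Python: the key is hypothetical, the layer id is a stand-in, and the path pattern mirrors what a layer's "Services and APIs" tab shows, so check that tab for the current form.

```python
from urllib.parse import quote

LDS_HOST = "https://data.linz.govt.nz"

def wmts_capabilities_url(api_key: str, layer_id: int) -> str:
    # Path pattern mirrors the URL shown on a layer's "Services and APIs"
    # tab at the time of the demo; check that tab for the current form.
    return (
        f"{LDS_HOST}/services;key={quote(api_key)}"
        f"/wmts/1.0.0/layer/{layer_id}/WMTSCapabilities.xml"
    )

# Hypothetical key; 51870 is a stand-in for a raster layer id.
print(wmts_capabilities_url("my-api-key", 51870))
```

Pasting the resulting URL into the WMTS connection dialog in QGIS or ArcGIS Pro is the same step as copying it from the layer page by hand.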
You can also pull through the metadata in the information tab, and this comes through in ArcGIS Pro as well. So that's, very basically, loading a WMTS; we're sort of ripping through it. The other one is the vector version: that's WFS, the Web Feature Service. So I'll go to the property titles layer. I will show this one only in QGIS and do the alternative in ArcGIS Pro; it's exactly the same process.
We just select the API we want to use, save it, and head over here. I've already connected it, but again, you just copy and paste it, give it a name, click on it and click Add. Now, this shows something really interesting: it's not going to be highly performant. If I zoom out so we try and get the entirety of the country, this is going to be shockingly slow. Why would I show that? It seems like a bad thing to do, but we've got an alternative. The speed at which this is happening is because it's trying to load in millions and millions of records, and as you can see, it doesn't happen particularly fast. Now, you can wait, be patient, save it, export it, do it manually.
That's fine, but it's kind of rare, I'd say, that most people would need the titles data for the entire country. Usually - say for a council - you want a specific area: you want to know where there's an error or something, you want to be able to nail down a particular area. So a lot of the time you'll find you default to downloading, because it's easy to draw a box or clip it to the territorial authority that you're looking at.
And that's fine; if that's how you want to manage it, that's wise, provided you can keep that data up to date. But we have a thing called spatial filtering, or attribute filtering, on the WFS calls if you want to still stream that data in, or to generate downloads or CSVs or whatever you want to do. We'll go into that in a little bit.
I just wanted to show you that you can plug it in on the smaller datasets. If you're doing this on, say, a small dataset like territorial authorities, it only takes a second to download or stream and you'll find it quick, but on the larger ones it can become quite a bottleneck. If you're using something like ArcGIS Pro, it uses the Esri services. So this is the same layer, and I will say that for use in ArcGIS Pro this is quite efficient; it's still taking time, but it will pop in in a second. To add these and connect up - in this case it's coming through my catalog - you can add these feature services; this is just where I pulled it from. If you've got it locally, it's just coming from this one here. You can see the pathway there, which isn't particularly helpful; it's your service item. Now, it's not enjoying this loading, but anyway, you can see how it works - it's still massive. So not the greatest thing in the world, but it does load more efficiently as a service in ArcGIS Pro.
Now, the Web Feature Services won't load at all if I try to connect them from the LINZ Data Service into here; there are some in-built policy reasons, I guess, why Esri doesn't play nicely with the OGC operations there. The functionality to add them is there, but I've just never made it work. So you do get a bit stuck with the WFS one there, but it does load through the Esri services quite nicely. We don't have comprehensive Esri services for every product; there are additional costs and overheads with hosting open data on two separate platforms. But if you have any further questions on that, just come and have a chat with us afterwards.
So those are the basics - the basic way to connect to the data. You just go in there, copy the layer's WFS or WMTS capabilities URL, paste it into your chosen application, and it loads. But you can get a little bit more specific than that.
So let's say, for example, you don't want all of the dataset, or you're using the download services but you're exceeding the LDS download limit - LDS has a 15-gigabyte download limit, for example. You can use the APIs for things other than just streaming data; you're not limited to that. You can bypass the UI in LDS in quite a few ways, which can be quite handy, particularly in process automation.
If you're keeping things continually up to date, you don't want to have to go in there every time and hit the download button; that's one of the big ones. Say you're downloading a tiled area - now let me find... I think it's the Taranaki one... yeah, that's right. If I add this file to a map - I'll get rid of this one - and let's zoom to it.
So it's quite a big area, and this is quite a dense dataset. We extend the crop out to the whole area and go to export it... all right, so it's almost 30 gigabytes, double the maximum size allowed for exports in LDS. Your options here are basically to change the crop so that you're getting less data, to make multiple downloads and stitch them together, or to pay for a courier service, which, you know, is never a thing you want to do with an open data platform.
So you can use the APIs to break this down into more manageable chunks. All of the raster services that we release are broken down into tiles - quite small ones, on a regular grid. So if you know what your tile list is, you can set up the download to crop to specific tiles. And because they always come out in tiles, it doesn't really matter whether you have one download or 20; you're still getting the same 30,000-odd GeoTIFFs out of it to manage, so you can break it up and get exactly the same kind of data output.
So to do that, you need to build a query around the APIs. Here's an example: this is a platform called Postman, which lets you create POST and GET requests, but you could do this in Python, SQL, whatever your chosen platform is. This is the way we've structured it: you connect up to the export URL and define the tiles that you want. In this case we've just gone for three. The headers hold my API key, which I won't expose here because it's one of the admin ones, and the language we're doing it in - in this case JSON. What this is going to do is send a request to the export service in LDS and bundle up data for download, without you having to go through the export utility.
So hit Send, give it a minute... right, we get this little output. What does that mean? It's just a rundown of everything that I just asked it to do. Most of it you don't need to worry about too much - timestamps and the metadata about the job. But if you go back to here, it's just created a download.
So it's done that without me having to actually go into the export utility, which means you can set this up to run programmatically. If I hit Download, it's going to get it as normal. What this won't do is automatically download the file, but you can expose the download stream if you want to automate that too: copy the link address.
So here we go, I've pasted it in here. This format is the same for every export that we do: it's services, API, and then exports, and then there's a little ID here. This ID is like a job ID that you've generated, and it gets exposed in Postman when you generate that export. So if I go back into here, you can see the first thing is an ID; copy and paste that, head back into here, and you can see that the two numbers match.
So you can then grab the stream, load it in your browser, away you go, and it'll start a download. You can run that whole process without touching the Data Service UI, which can be quite handy if you're handling large volumes or doing regular downloads of data, particularly rasters.
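That create-then-download flow can be scripted. A minimal sketch in Python, assuming the request shape shown in Postman - the field names and layer id are illustrative, so check the LDS export API documentation for the exact schema before relying on them:

```python
import json

LDS_HOST = "https://data.linz.govt.nz"

def build_export_request(api_key: str, layer_ids, crs: str = "EPSG:2193"):
    # Field names here sketch the shape of the Postman request in the demo;
    # treat them as an assumption and check the LDS export API docs.
    url = f"{LDS_HOST}/services/api/v1/exports/"
    headers = {
        "Authorization": f"key {api_key}",  # hypothetical key
        "Content-Type": "application/json",
    }
    body = {
        "crs": crs,
        "items": [
            {"item": f"{LDS_HOST}/services/api/v1/layers/{i}/"}
            for i in layer_ids
        ],
    }
    return url, headers, json.dumps(body)

url, headers, body = build_export_request("my-api-key", [51870])
# Submitting would be e.g. requests.post(url, headers=headers, data=body);
# the JSON response carries the job "id" mentioned in the demo, and the
# finished file is then fetched from .../exports/<id>/download/.
print(url)
```

The point, as in the demo, is that nothing here touches the LDS UI, so the whole thing can sit in a scheduled job.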
For the vector datasets, the way I would recommend keeping data up to date is through our changeset service. So here we go back to the property titles and into the services: this is WFS Changeset. A changeset is a way of getting instructions on how to update data that you already have, rather than downloading an entire fresh dataset.
So when you create a changeset, it'll give you a list of all the revisions from the history tab. You select the range that you're interested in - say your dataset was last downloaded on the 4th of August, so we'll want everything from the 4th of August to the 22nd of September - and you'll get instructions on how to update your dataset, covering just the records involved.
So it's a lot less than 2.3 million. Working through that, it'll take a while to download and package up, but we can skip ahead to what it looks like on output. This is pgAdmin. If I grab this here... there we go, this is a changeset. Most of the data looks pretty similar to what you'd get in the actual table, but you'll also have a column in here telling you what to do with each record.
These will be UPDATE, DELETE or INSERT. UPDATE is changing an existing record, DELETE is getting rid of one, and INSERT is a new record. Then with SQL, or whatever you prefer to update data with, you apply the instructions; in this case, replace the record with this ID with the rest of the content. As a rule of thumb, don't ever open this data up in Excel - you never know what's going to happen. As an example of why not: this Topo50 contours changeset can take a little bit longer, even though it's only returning ten records, and the reason for that is the extreme geometry. This will break if you try to open it as an Excel spreadsheet; it exceeds the maximum size a cell can hold.
So CSVs are fine, but load them into a database, basically, or directly into a GIS application - you can load these in as well; these lines here are all from CSV files, not shapefiles or geopackages or anything else. So that's, really quickly, the changeset service. We use changesets internally to keep the data up to date that we provide to other staff for analytic purposes.
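The apply step described above - walking the changeset and handling UPDATE, DELETE and INSERT rows - can be sketched in outline. In this Python sketch the '__change__' column name follows LDS changeset output, and the toy records are made up:

```python
def apply_changeset(table, changeset_rows, key="id"):
    # 'table' is the data you already hold, keyed by the layer's primary key.
    # Each changeset row carries a change-type column alongside the full
    # record; '__change__' is the column name in LDS changeset output.
    for row in changeset_rows:
        action = row.pop("__change__")
        rid = row[key]
        if action == "DELETE":
            table.pop(rid, None)   # drop the deleted record
        else:                      # INSERT or UPDATE
            table[rid] = row       # add, or replace the whole record
    return table

# Toy example: one update, one delete, one insert (values made up).
titles = {1: {"id": 1, "status": "LIVE"}, 2: {"id": 2, "status": "LIVE"}}
changes = [
    {"__change__": "UPDATE", "id": 1, "status": "CANCELLED"},
    {"__change__": "DELETE", "id": 2},
    {"__change__": "INSERT", "id": 3, "status": "LIVE"},
]
print(apply_changeset(titles, changes))
```

In practice this would be a SQL MERGE or equivalent against your database, as the demo suggests; the logic is the same.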
And mainly that's the property datasets, so the property titles and so on. We use Windows scheduled tasks: just once a week, run a batch command that reruns the process, scans LDS, pulls down the data and applies it. So again, if you're looking for more information on how to do that, please talk to us afterwards. On that as well, we had a few questions at the Esri conference around spatial and attribute filtering,
so I'll quickly go through that and then I think we'll grab some questions. Sorry, I'm just sprinting through this; this is normally quite a long presentation, but I'm trying to keep up. So if I open up here, these are what GET requests look like. They can be a bit of gibberish if you don't know what you're looking at, but basically you plug them into a URL and they return certain values based on what you've asked them to do.
So in this case, I've got a few of these pre-loaded; these are some of the examples that have come from the documentation page that we've got. If I grab one and hold it over here - I'm hiding some of the key from the page - we get XML. Here we go.
All right, so this is operating on the roads dataset. I'll start right here, copy and paste it, and show you what it looks like and actually explain it. With these GET requests, you're using a sort of standard query language inside the URL string to export information about layers, or about the data service itself. And if you're not adding any additional commands, this comes out in a complex XML format.
So this is simply giving us a list of all of the field names in the roads dataset. We've got the name, highway number, identifiers, lane numbers, status and surface. So say the work you're doing means you want to know every road in the country that has a metal surface: we can export data using these requests by filtering on those values, for example.
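A request like the one shown, listing a layer's fields, is a standard WFS DescribeFeatureType call. A sketch of building that URL in Python - the key is hypothetical, and the path pattern mirrors the LDS examples used in the demo:

```python
from urllib.parse import urlencode, quote

LDS_HOST = "https://data.linz.govt.nz"

def describe_layer_url(api_key: str, layer_id: int) -> str:
    # DescribeFeatureType returns the layer's field names and types as XML.
    params = urlencode({
        "service": "WFS",
        "version": "2.0.0",
        "request": "DescribeFeatureType",
        "typeNames": f"layer-{layer_id}",
    })
    return f"{LDS_HOST}/services;key={quote(api_key)}/wfs?{params}"

# 50329 is the roads layer id mentioned in the demo; the key is made up.
print(describe_layer_url("my-api-key", 50329))
```

Dropping the resulting URL into a browser shows the same XML field listing as in the demo.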
And there are tons of ways you can do it; that's just one of them. I'll share a documentation page where we list all the different ways you can do this - by bounding box, or a radius around a point, that sort of thing. So if I jump over to QGIS, this is the same one.
So here we connect to the data service and put in an API key. We say what the service is - WFS - it's got a version number, we give it a request - we want to GetFeature - and then we add the filters. In this case it's an attribute filter, and we're going to go by the name of the layer.
So this is the name of the layer - it's layer 50329; these are available on the data service - and then what the filters are. In this case, we're looking for all roads whose surface is metal, and I want to output it as a CSV; hit Enter and it'll just start a download. The value of these being based on a string is that you can plug them into code and have them push out of the browser and download automatically. I'm doing it manually here, but you don't have to; that's kind of the key. And then once we've got it, you download it. Now I'm going to skip ahead past the data extraction process and show you what that looks like coming out. I'll turn that on and zoom to the layer. Hopefully you'll spot a problem with the data pretty quickly. This is exporting everything we just asked it to do.
This is every metal-surface road in New Zealand. If you're eagle-eyed, you may notice that it's sideways. This is what they call axis flipping: depending on the software you're using, different versions handle the coordinate transformations differently. So if I turn this on and zoom out, you'll see it's flipped the coordinates. This other one is the same thing, but I've adjusted the version number, basically, and we'll give you instructions on that.
I point this out because if you follow the instructions directly from our website, it's going to flip if you're using something like QGIS 3.x, whereas it might not if you're using another application. If it does that axis flipping, it's easy to correct; just follow the instructions. So this has pulled in here, and if I open it up, you can see that however I sort the surface column, it's only going to give me the metal surfaces.
So there are 27,221 metal roads in New Zealand, which is a lot more than I would have thought, but there you go: not everything's sealed. The other thing you can do - as we pointed out before, we tried to load in the titles table, it loaded everything, and it took quite a long time. You don't have to do that.
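The filtered requests described here can be assembled in code rather than edited by hand. A sketch in Python - the API key is hypothetical, and the titles layer id and title numbers in the second call are made-up placeholders:

```python
from urllib.parse import urlencode, quote

LDS_HOST = "https://data.linz.govt.nz"

def getfeature_url(api_key: str, layer_id: int, cql_filter: str,
                   out_format: str = "csv") -> str:
    # Parameter names follow the GET request walked through in the demo;
    # note the version number, since it affects axis order in some clients.
    params = urlencode({
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": f"layer-{layer_id}",
        "cql_filter": cql_filter,
        "outputFormat": out_format,
    })
    return f"{LDS_HOST}/services;key={quote(api_key)}/wfs?{params}"

# Single-value filter, as in the roads example:
print(getfeature_url("my-api-key", 50329, "surface='metal'"))

# Stacked values via an IN(...) list, as in the titles example -
# the layer id and title numbers here are placeholders:
print(getfeature_url("my-api-key", 12345, "title_no IN ('AA1/1','BB2/2')"))
```

As the demo notes, URLs have limited length, so an IN(...) list suits a handful of values, not thousands.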
So similar to what we've done there, we've created a filter. If we go to this one here, it's the same thing: we hit the data service, put in an API key, say what the service is, then the version number - that's the 1 or 2, which is what determines the axis flipping - GetFeature, the layer name, and then the filter.
In this case we're going to say we want the title number field, and we want a specific title number; I've gone and grabbed quite a large one. And I'm putting it into CSV - it could be JSON or any other format you want, but in this case CSV. What that's going to do is output a single record.
So that's the same dataset; we've just modified this WFS feed to be specific to one record. All the data is still contained in there, but you can filter it out before it hits your service, and then the rendering is substantially faster. It turns on and off. You can stack these up as well - give it a range.
So in this case I've grabbed Kapiti Island as well, so you can add them in; again, quite a simple thing to do. If I go back to my query, the one below, you just give it a range - I grabbed two of them - and you can stack as many of these as will fit in the URL. URLs have limited capacity, so it's not a great approach if you need, say, 2,000 of them, but if you've just got a few that you're interested in, you can quickly check on them. These are quite handy because, if you're running these commands frequently, you can keep them on hand and just swap values in and out. The other thing is, it doesn't have to be CSV.
So if I go back to here, this is exactly the same thing, just output in the browser window, so I can go in and see information about it. This is everything to do with that particular layer, right in the browser. So if you're just after information really, really quickly, you don't actually need a spatial service; it's all exposed, and it'll also give you the geometries.
Again, an example of why this doesn't play well in Excel: you don't want to open this. You scroll down and there's the second one; they just sit back to back. So it's all the information that you need about it - this one's Kapiti Island. And if you're on the LINZ Data Service and you hit the Help button, under the LDS guides the documentation is actually pretty good.
These are all under the Web Services and Map Tile Services sections, and if you open them up, they'll give you examples of how to use these, the different ways they can be used, the applications that are commonly used, versions, that sort of thing. And that's the attribute filtering. I'd actually thought I would be doing spatial filtering as well, but we're kind of running out of time. That's filtering based on geometry - so if you tried to filter by, you know, the Auckland area or something - which is kind of handy. Yeah, so I'll hand back over.
Sorry James, I was just going to throw to questions from there. Thank you very much for sprinting through all of that. I think if anyone's particularly interested in particular aspects of that, we'd be more than happy to do follow-up sessions in greater depth. But while we still have time - I'm just conscious of it - do we have any pressing questions? Fire away. Nothing coming through the chat?
So we do have the LINZ Data Service email address at linz.govt.nz, so... yeah, I appreciate I've just done a filtered sprint through what's normally more like a two-hour presentation.
These have come from fairly common questions that we've gotten from people and industry over the last couple of years. So if you do have more questions around them, please let us know; we're pretty good at tackling them. And if there's anything you want to know that we haven't ripped through just now, please shout out. We're pretty responsive in our inbox, and we're happy to get on the phone and talk if you want to go more in depth.
Fantastic. And on this, just a bit of feedback that's come through: Max says he just wanted to say that was a great overview of the APIs; it would have been super helpful a couple of months ago when he was integrating with them.
I apologize for the timing there, Max, but better late than never, I'd say. But if you do have any other questions about what you're doing in future, as James has said, the team is really responsive and happy to help - and that goes for anyone else on the call as well. Thanks everyone for dialling in today, and thanks again to James for ripping through that at lightning speed, in about a quarter of the time you'd usually get.
And if you do want to let us know that something in particular is of further interest, just sing out and we can set up follow-up sessions. Otherwise, thanks again, have a great one, and we'll share the recording afterwards as well. Thank you.
[Ben Reilly begins the meeting]
Right we might kick it off there. Good morning everyone and welcome to the fifth webinar since we started the Aotearoa Property Data Network last year, it's great to have you all here.
I will be chairing today and we've got the brilliant Deb and Lisa helping run things in the LINZ office and a group of presenters today that I think you'll really enjoy. So I'll just kick things off with the opening karakia.
Whāia te mātauranga
kia mārama, kia tupu,
kia tiaki ngā whenua,
ngā moana, ngā arawai
Kia whai take ngā mahi katoa
Aroha atu aroha mai,
tātou i a tātou
Toi te kupu
Toi te mana
Toitū te whenua
Haumi ē, hui ē, tāiki ē!
… and that is just to say in English that we,
Pursue knowledge for understanding, developing
and caring for the lands,
bodies of water and waterways
Seek purpose in all that we do
Let us show respect
for each other
Hold fast to our language
Hold fast to our spiritual strength
Sustain the land
Gather and go forward together
Right, just to run through the agenda. Updates to start with, as usual a variety of colleagues here. Then I will be talking to an innovation initiative that we are currently exploring, called Joined-up Land Development.
Then we have got Zeniff from the Māori Land Court doing a live demo of their new and much anticipated Pātaka Whenua that has just launched.
And then we have got in the second half a round of Emergency Management-themed lightning talks, with Auckland Council, GNS, Manaaki Whenua, and Toitū Te Whenua each contributing. So, yeah, a great agenda there.
So to crack straight into the LINZ news and updates, I'll talk to Notice of Change to start with. We are stoked that we've got all 67 territorial authorities now signed up for the Notice of Change, with Dunedin City Council now being onboarded, or in the process of being onboarded.
That completes the national coverage, which is really exciting, and we can now really start to think about our next steps there. In terms of uptake, we're seeing solicitors’ uptake figures of north of 80 percent, which is also really positive in terms of processing those Notices of Change.
And some new enhancements are coming through as well. Through the Modernising Landonline programme, we can now - or solicitors can now - make amendments to information after the initial Notice of Change. So they can make corrections or keep editing after the initial registration has gone through, which is really helpful for them, and makes sure that we get good quality information through the rest of the property information system. So yeah, really good.
And now Connected Property Data. So we have got a Hybrid layer, restricted access - I'll pass over to Mike Judd, our Senior Business Architect in the Property Information Team to talk to this one, Mike fire away.
[Mike Judd begins presenting]
Good morning everyone. Connected Property Data - a couple of updates. Let's start with the Hybrid layer. The Hybrid layer is effectively what we're calling the best view of a property that we can deliver at the moment: it's our Unit of Property layer, in-filled where there are gaps with titles from the Property Titles layer, and where there are still gaps, in-filled with parcel data.
So in one layer you'll always be able to click somewhere and get the best possible information on a property boundary that we can give you. The development work on that is now finished, so we're just waiting to dot a few i’s and cross a few t's, and then it will be available on the LDS, at the moment with a government - central or local government - access restriction on it.
However, we are looking at actions to possibly make those boundary data sets more available in the future, so keep an eye on some updates on that coming up, maybe in the next - in the next meeting. But at this stage Hybrid layer should be available very shortly for central and local government use.
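The in-fill rule Mike describes - Unit of Property first, then titles, then parcels - is essentially a precedence rule. A toy illustration of that logic only, not LINZ's actual implementation:

```python
# Hypothetical sketch of the Hybrid layer's precedence rule: for any
# location, prefer a Unit of Property geometry, then a Property Title,
# then fall back to the cadastral parcel.
def best_view(unit_of_property, title, parcel):
    """Return (source_name, feature) for the best available representation.

    Each argument is the feature found at a location in that layer,
    or None if the layer has a gap there.
    """
    for source, feature in (
        ("unit_of_property", unit_of_property),
        ("title", title),
        ("parcel", parcel),
    ):
        if feature is not None:
            return source, feature
    return None, None

# Unit of Property has a gap here, so the title wins.
print(best_view(None, {"id": "T123"}, {"id": "PAR9"}))
# → ('title', {'id': 'T123'})
```

The point of the single layer is that a consumer never has to implement this fallback themselves: clicking anywhere returns whichever source was best.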
Other updates to NZ Properties Data available on the LDS for central and local government use - we've added what we're calling a parent-child table to the set of tables that basically links together properties, where they are related to each other. So for instance where you have an apartment - unit-titled apartments - and they're all in the same development, we can link them together using this new parent-child concept.
Another instance where we've used it is where we've got pastoral leases, part of which is Crown lease and part of which is freehold - we can use this parent-child to bring those two different areas of land together.
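The parent-child table described above can be pictured as a simple linking table. A toy sketch with hypothetical column names (not the actual NZ Properties schema):

```python
from collections import defaultdict

# Hypothetical rows from a parent-child table: each row links a child
# property (e.g. a unit-titled apartment) to its parent (the development).
parent_child_rows = [
    {"parent_property_id": "P100", "child_property_id": "P101"},
    {"parent_property_id": "P100", "child_property_id": "P102"},
    {"parent_property_id": "P200", "child_property_id": "P201"},
]

# Group children under each parent so related properties can be treated
# as one unit - all the apartments in a development, or the Crown-lease
# and freehold portions of one holding.
children_by_parent = defaultdict(list)
for row in parent_child_rows:
    children_by_parent[row["parent_property_id"]].append(row["child_property_id"])

print(dict(children_by_parent))
# → {'P100': ['P101', 'P102'], 'P200': ['P201']}
```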
So there's an update - there's a new table - and it should be available now on NZ Properties. Another change we've made to NZ Properties recently is that we have changed the unique identifier for our Unit of Property table. Previously it was just a sequential number starting at one; we've now changed that to a Universally Unique Identifier. So we've introduced a UUID for that Unit of Property table, and again that should be available now. And one more update from the Connected Property Data team - that is a new release of the Property Data Management Framework, release 1.2, which was put live on our website in May.
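One practical upshot of the identifier change is that downstream systems should treat the new key as an opaque UUID string rather than a number. A small sketch - the field name and the value here are hypothetical:

```python
import uuid

# Hypothetical record from the Unit of Property table after the change.
# The old sequential integers could be renumbered across reloads; the
# UUID is globally unique and stable, so it is a safer join key.
unit_of_property = {
    "unit_of_property_id": "b6c1f3a0-0000-4000-8000-000000000001",  # made up
    "address": "1 Example Street",
}

# Validate the identifier is a well-formed UUID before using it to join
# against other tables.
key = uuid.UUID(unit_of_property["unit_of_property_id"])
print(key.version)  # → 4
```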
A whole host of minor changes to that, a lot of changes to make the text more consistent, update all the diagrams so they're more aligned with best practice, some more examples were added. But the big change was the introduction of a lot of material from the Property Spine Project we did last year with Stats NZ, which talked about how we would incorporate dwellings into a property spine.
So a lot of the lessons we learned from that project have been incorporated into the PDMF, and that's now all available on our website, and we're now thinking about release 1.3. Ben that was it, did I miss anything?
Brilliant, no, I think you covered it all nicely there, Mike, thank you very much. Cheers.
And next we have Susan, our Senior Resilience Advisor talking to Suburbs and Localities dataset.
[Susan begins her presentation]
Thank you Ben - another project that Mike's been heavily involved in, and it's a real pleasure to tell you that Suburbs and Localities has now been formally published on the LINZ Data Service. Four or five years ago I never thought we'd see this day, so it's awesome to see that it's finally out there!
So it's on the new-look LDS, which Vic is going to tell you a bit more about in a moment, and it's also available as an ArcGIS REST Service.
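Since the layer is on the LDS, one way to pull it programmatically is a WFS GetFeature request. A minimal sketch that only builds the request URL - the API key and layer ID below are placeholders, not the real values, so check the layer's page on the LDS for the actual ID:

```python
from urllib.parse import urlencode

# Placeholders - substitute your own LDS API key and the layer ID shown
# on the Suburbs and Localities page of the LINZ Data Service.
API_KEY = "your-lds-api-key"
LAYER_ID = 12345  # hypothetical, not the real layer ID

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": f"layer-{LAYER_ID}",
    "outputFormat": "json",  # ask for GeoJSON
    "count": 10,             # just the first few features for a quick look
}
url = f"https://data.linz.govt.nz/services;key={API_KEY}/wfs?{urlencode(params)}"
print(url)
```

The same layer can also be consumed directly from the ArcGIS REST Service endpoint if you are in the Esri ecosystem.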
Do you want to flick me on, Ben, is that gonna work? Cool.
So, just for those of you who are interested - I think we touched on this last time, when Trent mentioned it might be a possibility - we've had great support from Stats New Zealand and we have been able to launch with a population estimate. We are working on the number of decimal places, because that doesn't really work as a population count, does it. But the population's in there, and we really appreciate that Stats are going to maintain this going forward. It's relatively old data at the moment - it's 2018 - but basically it's a placeholder so we can put the new census data in there once it becomes available from Stats. Thank you, Ben.
So basically our customers told us two things when we took responsibility for the data: they wanted a more simplified data structure, and they wanted it to be easier to request a change. So there's some documentation around how to make that change, and I'll take a look at that in a moment - I just want to go to the next slide, Ben.
So if you remember nothing else from this morning, it's the following key dates. We're looking to take down the pilot layer of Suburbs and Localities in the middle of August, so the new Suburbs and Localities replaces the pilot. And at the same time - just to try and help with the communications - we're looking to take down NZ Street Address.
So if you've not adopted NZ Addresses already, communications will be going out shortly to remind you that NZ Street Address will come down in the middle of August - Wednesday the 16th of August.
And then to give people - we always try and give people around six months to do a transition - and so we're taking down FENZ’s NZ Localities, which is still a well-used dataset. We're taking that down in November later this year, so I just want to take a moment - just to take you through that in a little bit more detail.
So we've got the data on LDS and we've got this page on the LINZ website, and if you scroll down there's a Data Dictionary, there's information about how to make a change request, and then there's the principles, requirements, and rules that govern that change request - and a big ‘thank you’ to Michael Brownie and the team at Wellington City Council for helping us with that.
And so basically you can either put a change request in just by emailing addresses@LINZ, or if you're a TA you can request restricted access to this change request form. So if you thought the Chatham Islands really could do with a new locality, you'd just get access to this form, put in a new name and say that it's a locality - if you can spell it - and then you could submit that, and it gets considered by the team. So there are two ways requests can come in, via email or via the app, and either way that information will go back to the Territorial Authority or to the New Zealand Geographic Board, and the change will either proceed or be declined at that point.
And I just wanted to introduce this concept of minor and major changes. If it's a minor change - just a simple realignment to a parcel boundary, for example - then it will just get processed and go into the data. But if it's a major change - a significant boundary realignment, or a new name - then it will go to a review panel, and that panel will meet three times a year, with their first meeting in November. So I just wanted to share that with you, and that's all from me, thank you.
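The minor/major routing just described is a simple decision rule. A toy sketch, with hypothetical category names rather than LINZ's actual schema:

```python
# Toy sketch of the change-request routing described in the talk.
# The category names here are made up for illustration.
MINOR_KINDS = {"parcel_boundary_realignment"}           # processed directly
MAJOR_KINDS = {"significant_realignment", "new_name"}   # go to the panel

def route_change_request(kind):
    """Return where a suburbs/localities change request goes next."""
    if kind in MINOR_KINDS:
        return "process_and_publish"
    if kind in MAJOR_KINDS:
        return "review_panel"   # the panel meets three times a year
    return "needs_triage"       # anything unrecognised gets a human look

print(route_change_request("new_name"))  # → review_panel
```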
Brilliant, thanks Susan, I will just share from my end again - if it wants to do it - there we go. And so that was all for Suburbs and Localities, and now we have Victoria Lindsay, Manager Open Data and Reuse talking to the LINZ Data Service.
[Victoria Lindsay begins presenting]
You may well have seen that late last month the LINZ Data Service rolled over to the new Koordinates user interface. We were one of the participating agencies that use the Koordinates platform - in fact the last one to do so - so you're probably already familiar with some of the new features, functions and ways of working that the interface presents.
So, we're pretty happy to be on this new interface. It's been a long time coming - it's been about nine years since the last major update - so I recognise that there may be a bit of a hump in getting used to things. But the good thing is that there are lots of things that you'll still recognise, and some things are still quite similar. But there are some new and different things that hopefully you'll get some benefit out of as well.
Some of the key things you might enjoy, if you haven't already had a look around and found them: there's the ability to favourite datasets - in the slide there, there's a star on the right - so you can favourite them and curate a list of the datasets that you can always go back to.
Some other things that are different too: the ability to create multiple map views, which are like snapshots of different layers that you have added at a time - they're also persistent within a session, so you can come back to them. So that's getting a bit more into the customisation space. The other thing that you may see that's a bit different is the export process. That's more integrated now, and there's a couple of ways to export things, so it provides a bit of flexibility in the ways that you want to grab your data. At that step the APIs are integrated as well, so it actually surfaces our APIs much more than they used to be - especially the ability to use the changeset - so hopefully we'll get more customers making use of those services.
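The changeset ability mentioned above lets you request only the features that changed between two instants, instead of re-downloading a whole layer. A rough sketch of constructing such a request - the API key and layer ID are placeholders, and the exact parameter shapes should be checked against the LDS API documentation:

```python
from urllib.parse import urlencode

API_KEY = "your-lds-api-key"   # placeholder
LAYER_ID = 12345               # placeholder layer ID

# A changeset request asks a layer's "-changeset" view for features that
# changed in a time window, passed via the viewparams parameter.
params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": f"layer-{LAYER_ID}-changeset",
    "viewparams": "from:2023-06-01T00:00:00Z;to:2023-07-01T00:00:00Z",
    "outputFormat": "json",
}
url = f"https://data.linz.govt.nz/services;key={API_KEY}/wfs?{urlencode(params)}"
print(url)
```

Each returned feature carries a change indicator (insert, update or delete) that you apply to your local copy to keep it in sync.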
So we've had lots of feedback coming in over the last few weeks and if you have some more please send it in. There's some tweaks that are being made still, this is not sort of set in stone, so if there's something that you want to let us know about - good or bad - then do that, that would be awesome.
The other thing of course that the new UI will support is the publishing of terrestrial point cloud data. That's in the final testing stages at the moment, and it's part of our drive towards consolidating some of our channels - making it easier for you to get our data by providing it through LDS, this single interface.
So that will enable you to export the data in LAS and LAZ formats, as well as some GIS formats, and in the future there will also be the ability to do point cloud streaming. Another thing that will be part of that rollout is a progression towards a 3D map viewer as well, so you can actually visualise that data and get the best out of it right in the interface, too.
So there's some pretty cool stuff coming - within about the next quarter.
So that's the good stuff from us, so yeah, again, just another call out if you do have some feedback, send it, we’d love to hear it, thank you. And you've got a poll running.
Thanks Victoria, yeah well there's a poll that's gone up that says “how are you feeling today”, I think, if I'm seeing what everyone else is seeing, and it's supposed to be asking how people rated the data service’s new UI, so I'm not sure what's happened there!
We’re feeling great!
Yeah good! That’s awesome!
We’ll just assume everyone loves the new user interface, as well, because it does look fantastic.
Okay, thanks again, Victoria, and now, if it works, we'll hand over to Zeniff from Te Kooti Whenua Māori to give us a live demo of Pātaka Whenua.
[Zeniff Haika begins presenting]
Thank you very much for inviting me to speak to this forum. Some months ago, oh was it last year perhaps, I think I shared sort of ‘where the Māori Land Court was heading’ in terms of its technology, etc., and its way of working. And recently the Māori Land Court rolled out Pātaka Whenua, which is the information system that we use in the court, and its intention is also to replace Māori Landonline, which some of you may be familiar with.
What I’m intending to do today is run through a bit of a live demonstration of what Pātaka Whenua is and what you can do in Pātaka. So I'll quickly share my screen, and you'll probably see a little website. So the easiest way to access Pātaka Whenua, I think, is via the Māori Land Court website.
So if you just throw into the search bar ‘www.maorilandcourt.govt.nz’ it will bring you to this page here. You'll see that there’s some messages at the top, as you do with technology there's often issues that come up and so some of those are being worked through by our teams that are involved with that particular piece of mahi.
But this is the website page and it's all about connecting whānau to the whenua – Te tuhonotanga o te tangata ki tōna whenua - and in here what you can find is pretty much everything you need to know about the Māori Land Court – Te Kooti Whenua Māori.
What I will do today, however, is we'll go straight into Pātaka Whenua to see what that looks like. Now to get in there you can either go straight there via these links at the top, creating an account or logging into Pātaka if you have an account, or scrolling down - after we get messages from various people - you can also access it via the link down the bottom here.
So we'll click into there and hopefully something comes up – cool. This is the landing page for Pātaka Whenua. So in here what you can see is you can either - you can register as a registered user, what that enables you to do as a registered user is, if you submit an application, you can then track the progress of your application through the various processes that it goes through in court, and these are the types of applications that you can file via Pātaka Whenua [referring to screen].
Of course you don't need to create a login, you can also submit your application as a guest user, and as part of that you would enter your email address, and then any correspondence that comes from the court will be sent to that address. So you would still be able to keep in touch with your case manager, if that's something that you want to do, or even contact someone at the court to make an inquiry, or anything like that.
But what we will do today is simply look at some of the Search features that are in Pātaka Whenua. And so if we click on there [the 'Search' button], these [referring to screen] are some of the search functions that you can use via the customer portal.
So you can search for any Blocks.
Documents are there but at the moment we're just going through a bit of a process, we need to update our Māori Land Court rules to make these more accessible to everyone via the portal. Access to the court record is governed by Māori Land Court rules so we're just making some tweaks there, but the intention is for - in terms of documents - things like Court Orders and Court Minutes will be readily available here, via the portal. And so you can view some of those documents, once we get the green light for that to happen.
Management Structures are any land trusts or family trusts that are owners in land or hold land for administration purposes.
And then of course Ownership is where you can search for an owner in Māori land, and any ownerships that they hold across the country will come up there.
There's also the Block Map there, which, if you're familiar with Māori Landonline, is very, very similar to that, where you can zoom in and find your blocks and everything. I will note, though, that depending on your browser it may take a little while to load all of the details.
Let's just zoom in and see what shows up. So yeah, similar to Māori Landonline, if it's of the status Māori Freehold Land it will be highlighted like that in grey and so forth. And then once the page loads properly, you can click on one of these and go directly to that block from this page as well. So similar to a lot of the other online features that you get with this type of technology.
Let's go into the Block search - and if I'm going a bit too fast, call out and I can slow down, or I'm happy to take any pātai that you might have as well. When searching for a block, most of us would use either the block name or the legal description. If you know the block ID number - that's a Māori Land Court reference - you can enter it in there, or if you want to search by a district in the Māori Land Court, you can also search there. If you have the LINZ title reference you can also search via that reference. So if I just type in one block, we can have a look at what that looks like.
So Rotoiti 15 is one of our blocks in the Waiariki district. And this is what we call the Block 360 - everything to do with the block is displayed here: the block ID number, for future reference if you want to search for this block by that reference; any alternate names, if there are any; the total number of shares; area details - anything to do with the land, effectively. And we also get the block shape in there, of all the various blocks in Rotoiti 15, which is the block that we were looking at.
As we scroll down the page, this is probably some of the info that may be of benefit to those of you on the call who are land administrators. Where a block is administered by what we call an Ahu Whenua trust, or a Land Administration Trust, they are the Registered Proprietors on the title, should you need to engage with any of the owners there. In here it shows who the trust is, and you should be able to click on there, which then takes you to that particular trust and lists its current responsible Trustees. So that information is there as well.
You can scroll down to the ownerships, so you can see who all of the owners in this land are, how many shares they hold, and the ownership type - whether they're a tenant in common or any other type of owner. And this is a new one that came about as part of the amendment to the Act in 2021, where the Court can also grant certain entitlements and rights to individuals - so these people here have been granted a right to any income that's generated from that land, which is to be paid to those particular individuals.
So in here what I did want to show is what we refer to as our Memorial schedules. From a data perspective this may be of some benefit to some ... [pauses due to background noise] ... If the court makes an order, such as granting sites of occupation to various whānau members, then we would record that in the Memorial schedule. So if you're interested in data such as dwellings on Māori land, we record that in the Memorial schedules for each of the blocks. We have had some requests from various organisations for some of that data, so we can also extract it from the database and provide it in a way that is more user-friendly for those data purposes.
So that is what information shows up in Pātaka Whenua for a block. Of course we have our LINZ details - our parcel ID references and the LINZ title references. Every couple of weeks a couple of us from the Māori Land Court engage with colleagues from Toitū Te Whenua to look at this type of data, to ensure that it aligns. You may be aware that changes are often made either via the Court or via Toitū Te Whenua, and when those changes occur we need to ensure that the data in both Registries aligns. So we have those discussions with Neil and others at LINZ, to work that out and make sure that the information that surfaces is correct in both Web Search and Pātaka Whenua.
Let me close out of here [referring to screen]. In terms of searching via a LINZ title, it's just as simple as adding the number and pressing Search. Well, I say it's just as simple, but maybe it's not - oh - I need to remove that first, as you do. And here we go, clicks into there. And then again, this is a block up in the Far North, the Takou block. There were some recent surveys done for Little Takou Island, which - whoops - for those of you who may be interested, is where the Mataatua waka landed as it travelled north from the Tauranga area.
So, yeah, and if I click in here - I think here, yep - we get a bit of information around licenses to occupy dwellings, etc., that are located on this land. I think there are probably about 20-odd dwellings on this land, and that is captured via the Memorial schedule following the granting of orders or licenses to various shareholders in the land.
If searching for a Management Structure, it's simply a matter of adding the name of the trust. So we can even go for one of these [selecting from a dropdown list under the 'Organisation name' field], and then it searches, pulls up the 360, and shows you who the trustees are of that particular trust.
Same thing with the ownership: enter an owner's name in there [the 'Owner name' field], and it will simply bring up all of the ownerships that this person holds, and you can click into there to go into better detail as to what those ownerships hold. So in terms of what's available in Pātaka Whenua, all of the Court record effectively is available, although we haven't yet released the documents - but that is coming up, and so you will be able to access them. In the meantime, what we're advising all users is, should you wish to view a document, to submit an inquiry and we can email that document to you. So, yeah, this is Pātaka Whenua.
Kia ora Zeniff, we've got a few questions which have come up in the chat. One of the questions, from Michelle, is: "just wondering if there's a way to download the data as a GIS layer, please?"
Yeah, we've had many requests for that type of data! That is something that we are working through with the data team from the ministry as well. So we're understanding where the data sits in the database and what we need to extract to provide that. I know we used to provide it - the last shapefile I think was from 2017 - and I know that there has been interest in getting an update of that dataset, so that is something that we are working on at the moment. It's not downloadable from Pātaka Whenua itself, but we are aware there is an interest in it, and so we are working towards making it more available to people who request it.
Kia ora, thank you, Zeniff. The next question is from Fraser, who's asking, "how often are the boundaries and/or the attributes of the features updated?"
Boundaries are usually updated as a result of a Court application for a partition or something of a similar nature. Sometimes, if a boundary is incorrect, the Court can make an order to realign it, but then there'll be survey plans and so forth produced as a result of that. And so that will go through the process of being updated in the cadastre and then carried over to Māori Landonline. So the boundaries don't change often, because applications for partition or subdivision are not very common, but that is how they would change if they needed to.
Thank you, Zeniff, the next question is from Anya who's asked, “has any work been done on the areas attributes for amalgamated blocks?”
I'm not quite sure what that’s referring to. I know there's some work going on for amalgamated blocks, but was there anything in particular? Is it the area of each of the blocks that have been amalgamated, is that what we're talking about?
If Anya’s online, would you like to ask, Anya, your question to Zeniff?
Yes, sorry, it wasn't very clear. I was using the MLC 2017 dataset for analysis about areas of blocks, and there's an attribute in that dataset that gives you, I think, the rounded-up value in hectares for these amalgamated plots. And it seemed, quite frequently - and this was for the Far North - that a value had been put in there for what appeared to be amalgamated blocks that was maybe the value for just one of the parts. So it was much smaller than the amalgamated block actually ended up being. I realized this because of the analysis I was doing - the amalgamated block value in the file didn't match the actual area that that block covered. Sorry, long explanation.
All right, all good. There is, yeah, some work being done in that space, particularly around the amalgamated and even the aggregated blocks, where sometimes the area values are a little bit out of sync. As part of that we are reviewing a lot of those blocks, and we intend to work with our Territorial Authorities around updating some of the values as well for the blocks that we use. So it is something that is being looked at at the moment, but unfortunately I can't give a time frame as to when that will be completed. Perhaps, Anya, if you want to email something through to me, we can easily have a look at what it is that you are looking at and ensure that it is correct.
Thank you Zeniff. The next question we've got is from Kirsten, and it asks, "does this data supersede the TPK Māori land blocks layer?"
I'm not sure about anything superseding anything. The data we have is the most recent data from the Court record, and it's updated recently with GIS data provided by LINZ. I think TPK do receive the data from us, so if the data that they've received is from before this year, then it's likely that it would be superseded by this new set, because changes are made every day as a result of Court hearings and Court orders being made. So, yes, it would be.
Kia ora, thank you, Zeniff, the next question is from Chancell, who says, “forgive my ignorance but there are only six districts in the list of the block search, is this normal or will the list be updated?”
Maybe I'm looking at something else, but to me there's seven there. But if it is wrong then yes, it will be updated - at the moment, though, it looks to be right.
Kia ora, thank you, I don't know if anybody else has any questions. Those are all the questions that we have in the chat at the moment, Zeniff, thank you very much indeed.
Cool, ka pai, thank you and, um, Ben you have my email address or should I pop it into the chat in case anyone wants to email a question through?
Feel free to pop it into the chat but I can also - I've got it - and I - we - can distribute it with the notes after as well, for sure.
Okay, cool, ka pai. Cool, well, no more questions then that's me!
Sweet, thanks a lot Zeniff. It was really great to see that go live after much anticipation.
Great, cool, I will share from my end again now.
Sweet, so, I am going to be talking now about Joined Up Land Development, an innovation initiative that we are exploring here at LINZ. [Referring to screen] If it goes, cool.
So, the problem in a nutshell: resource consenting is too hard and takes too long, because it's disjointed, labour intensive, and lacks transparency. This leads to project congestion and additional costs, prevents evidence-based decision-making, and means there is no visibility of the development pipeline for LINZ Property Rights teams.
So, we at LINZ cannot anticipate the resourcing we need for receiving and approving survey plans, and beyond that the requests for title as well. That creates congestion for us, and because we are typically at the end of the land development process, we bear the brunt of any delays that have happened up to that point, with projects wanting to catch up on time. But more broadly, a problem shared across the land development sector is that there is no visibility of the development pipeline - of what's being built and where at any given point in time.
This leads to the deficits we all know about in housing and infrastructure, in the context of population growth that tends, if anything, to be underestimated and will continue on. It leads to increasing complexity when we try to grapple with the full complexity of the environment for environmental management. And we have a climate imperative as well, bearing down on us, that we need to mitigate and adapt to - and it doesn't help when it's difficult to get things built.
So, the goals of Joined Up Land Development would be to: integrate the land development process and all of the stakeholders involved in the process, increase transparency and promote efficiency across the land development system.
We've got some overseas examples that we're looking at. SPEAR in Victoria was launched just over 10 years ago, and you can see there they have joined together the surveyors, councils, referral authorities and the land use authority. It's a very similar sort of environment to what we've got here in New Zealand, with a similar number of local authorities that they worked with and brought into that programme as well.
And then in Singapore they've got a system a couple of generations older, called CORENET, initially built in the 90s. They've of course got an advantage in being quite a small state, where it's relatively easy to join things together centrally. But yeah, definitely some lessons learned there, and interestingly for them it goes a bit beyond land development into property management as well - facilities maintenance and things like that.
So why LINZ? We are the stewards of the property system, and subdivision portion of land development is fundamentally about creating new property. Any data created at this stage and any issues with that data flows right through the entire property data ecosystem and persists through the system. So we want to make sure that that is right first time as much as possible. We also have strong stakeholder networks across local councils where we work with them on various things, providing GIS, sourcing aerial imagery and making that publicly available, and LiDAR, through emergency management, through addressing, Notice of Change which we talked to earlier, and valuation as well - property valuation for rates. We also through, by dint of being LINZ, we have good working relationships with the surveyors, and the property lawyers, especially with Modernising Landonline and those professional networks that we've tapped into as a fundamental part of that programme to help drive the design and success of that build.
Regulatory roles as well. Through the Surveyor-General and the Registrar-General of Land we have direct involvement in a portion of the land development process, alongside the likes of the Ministry for the Environment and MBIE, who are more commonly thought of as the regulators of parts of that system.
The Valuer-General as well, of course, has an interest in the outcomes in terms of the quality of information generated for property valuation, which is critical for local councils for their rates and their revenue. So we want to make sure that's working well. And in terms of our strategic outcomes, we have an outcome to be a trusted regulator delivering fair and transparent regulatory systems, so this is definitely one area that's ripe for improvement on that front.
Because we can. We have IT development capability being built up through the Modernising Landonline programme, where we currently have between 120 and 130 developers working in-house across 14 different squads, operating under a Scaled Agile Framework.
We'll be retaining a good portion of that capability going forward, both to maintain and continue to enhance the New Landonline, but also with some capacity to look at other projects as well. So that is something new for government, where we will have that in-house capacity to do IT development in a significant way.
And just in terms of the property system and its centrality to a lot of other things, I wanted to show this slide. [Referring to screen] As you can see, property system stewardship flows through from property into a lot of other systems. It's the hub of a lot of other fundamental and critical systems for New Zealand.
So it's really important that we're getting that right, and an opportunity to intervene here can flow on into a lot of other areas of our day-to-day lives and the running of the country. So yeah, an important role there.
So, in terms of where this fits with our current work for Modernising Landonline: we are currently in the migration stage, where we have stood up the Dealings App and the Survey App, and onboarding of firms and users into those new apps is now well in train.
And a significant milestone this year will be turning off survey capture in the 'Legacy' system. That will be the first really big, chunky bit of the old system that we turn off. So that's a really big milestone in terms of starting to free us from needing to maintain two systems in tandem and worry about things being backward compatible while we're trying to build the new system as well. That's really exciting, and we're currently piloting it with one firm, where we've turned off legacy capture and they're just using the new system.
And then looking forward, longer term: we want the New Landonline to have replaced Legacy, like for like, by the start of 2025, and then to really look to enhancements and new features. Things like closer integration with the Māori Land Court's Pātaka Whenua, going fully digital with survey plans, improving the subdivision process, increasing digital data capture, and starting to look at how we can integrate with third parties like councils and others as well. So a bigger ambition there, beyond the immediate needs of Landonline, for sure.
So now we're going into the second half and our round of lightning talks, on the theme of Emergency Management. And just to quickly run through who we've got:
- Jade Rutledge from Auckland Council, on the work they did behind the scenes on the red and yellow stickering of properties after the flood events at the start of the year.
- Then Chris Massey and Gerry Blair from GNS, talking about the work they're doing assessing landslip risks to property across the country.
- Then Dr Shaun Awatere, who we can see has joined us, from Manaaki Whenua Landcare Research, talking about Māori Frameworks for Recovery.
- And then our own Rob Deakin from Toitū Te Whenua, on something he's been working on: an emergency management data portal.
So, firstly, I'll pass to Jade from Auckland Council.
[Jade Rutledge begins presenting]
Hi guys, so I'll give you a bit of my background.
So I've been at Council for seven years and worked on projects all across our property data space. That includes our Property Information Team, Property Files and LIMs, as well as our digitisation projects, which cover digitisation of our legacy records and our RPA teams.
So I'll just share my screen. I put together a little presentation I'll send through to Ben later. Just the high level of what we did for our response to the rapid building assessments and the emergencies we faced. So, just hang on a minute [starting up screen sharing].
Okay so I've included a few slides of images in here, just because it's important to bring back the context of what we actually face [referring to displayed image]. So this is one of our West communities, I believe this is Muriwai and the large land slips we faced out there.
So just to give a bit of context to the issue: in January, over Auckland Anniversary weekend, we had our big flooding event. This was the first time in Auckland's history that a State of Emergency was activated. A one-in-200-year event; I don't know anyone in the region that wasn't affected.
So we were responding to that event and dealing with those situations, and then two weeks later we had Cyclone Gabrielle, then in April we had the tornado, and then in May we had another flooding event. So for us it's been very busy – non-stop!
When we get kicked into emergency response, like other councils we use the Survey123 app so our inspectors can go out and do their rapid building assessments. At the time I was looking at business improvement for our Property File and LIMs Teams, so we raised the question, "Hey, what does this emergency response mean for our Property Files and LIMs? What are our obligations in declaring all this information to our ratepayers and public customers?"
So here are some more images: you can see on the top left houses completely destroyed, property submerged, roads completely collapsed. Driving around the region at the moment there are still slips, roads closed, traffic management everywhere.
Now I'll refer back to the placards and what they mean. You've got your 'reds', which indicate significant damage. That means our homeowners or renters can no longer live in those properties at this point in time.
Then restricted access, our yellow placards. It could be that part of the building was damaged, or part of the land was unstable. It could also be that our building inspectors couldn't do a detailed assessment at the time. These assessments are done in only 20 minutes, and while we may have Geotechnical Engineers and Building Inspectors on site, we may need an Electrical Engineer or other further expertise to come in and help out. If that's the case, we'd issue a 'yellow'.
And I've put ‘red’ on here, I know Ben you mentioned - I put ‘white’ on here, I know you mentioned ‘red’ and ‘yellow’, so, maybe controversial, but our response was also to ‘white’. ‘White’ was low risk, you could go back into your house and live there if you needed to, but for our responsibility we had to declare that just in case five, ten years down the track there was another issue.
So, at a high level: a natural disaster occurs, and our regulatory teams combine and start those Rapid Building Assessments within 48 hours. With a lot of the assessments I saw, we also had colleagues from other councils come out, pulling resources together as one team. Data is entered into the Survey123 app as assessments are completed, and we extract the data.
[Referring to slide] This fourth box is probably where we had the biggest pain points. During an emergency, where everything's so quick, data isn't always as accurate and helpful as it could be. Each area in New Zealand obviously has its slight quirks, and Auckland's no different: when we have inspectors come up from other councils, they don't necessarily know the details of the way our land and our property addresses work. Then you've also got the challenge that customers' or NZ Post addresses might be slightly different to the ones we use in our systems.
So our team spent quite a few weeks going through every single placard and every single data point, cross-matching using GIS, trying to get it as correct as possible. At the same time we were also sending out comms to homeowners, prioritising the 'reds' first because they were the most urgent, then the 'yellows', and obviously the 'whites' last, and migrating all that information onto our SAP system.
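The address cross-matching Jade describes can be sketched roughly in code. This is a minimal illustration only, using Python's standard-library fuzzy matching; the addresses, Property IDs, and function names are all invented for the example, not Auckland Council's actual data or systems:

```python
import difflib

# Illustrative canonical address list (a stand-in for the council's
# GIS/SAP property records; addresses and IDs are invented).
canonical = {
    "12 Motutara Road, Muriwai": "PID-1001",
    "14 Motutara Road, Muriwai": "PID-1002",
    "3 Oaia Road, Muriwai": "PID-1003",
}

def normalise(addr: str) -> str:
    """Lower-case, strip punctuation, and expand common abbreviations."""
    addr = addr.lower().replace(".", "").replace(",", "")
    expansions = {"rd": "road", "st": "street", "ave": "avenue"}
    return " ".join(expansions.get(word, word) for word in addr.split())

def match_property(raw_addr: str, cutoff: float = 0.75):
    """Return (canonical address, property ID) for the closest match, or None."""
    lookup = {normalise(k): k for k in canonical}
    hits = difflib.get_close_matches(normalise(raw_addr), lookup, n=1, cutoff=cutoff)
    if not hits:
        return None
    best = lookup[hits[0]]
    return best, canonical[best]

# A free-text assessment address still finds the canonical record.
print(match_property("12 Motutara Rd Muriwai"))  # -> ('12 Motutara Road, Muriwai', 'PID-1001')
```

In practice a real pipeline would also bring in spatial joins and an authoritative address dataset, but the idea is the same: normalise both sides, then match with a similarity cutoff so near-misses get flagged for review rather than silently mismatched.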
Not all staff at Council have access to Survey123, but the majority of us need that information for some aspect of our jobs. As mentioned, you've got the regulatory teams, like Compliance and the Inspectors, who need to refer back to it to make sure everything's signed off and back to code. Then you've got our Property File, LIMs and Property Information Teams, where we need to make sure everything's correct for our LIMs and property files. But you've also got the Grants Team, who needed to make sure they were providing the correct grants to the right people after the Mayoral relief fund was released. And Rates: we don't want to send debt collectors out to people who have been red-stickered, so they need to know that kind of thing.
So we worked on migrating all that information into our SAP system, with notes, inspectors' details, and photos of the damage if there was damage. That all appears on the property file and the LIM, and at the moment we're in the ongoing maintenance stage. As I mentioned before, these are 20-minute assessments, it's fast paced, and the data isn't always correct, so we are continuously correcting it. If a homeowner disagrees with us, they can contact us and dispute it, we'll investigate, and if we have made the wrong call we'll wipe it from the property and remediate all that work.
I thought I'd cover the legal obligations for Property Files and LIMs. There were quite a few meetings back and forth between our departments. Obviously, once something gets put on a LIM, people get quite defensive and reluctant to come forward and own up to the damage to the property. But we do have an obligation under LGOIMA to disclose any information we know about pertaining to the history of the property.
So what we did, pretty much, was work with our legal teams and compile wording to add into LIMs. This wording doesn't give information that might breach the privacy of the homeowners, but it will say, "hey, on this day, this property was issued a placard." We've also added the status of the placard, whether it's 'open' or 'closed'. 'Open' means, obviously, it's open and they need to contact us if they want to get it closed. 'Closed' means it's resolved and Council has pretty much no concerns at this point in time.
Legal advice initially was that, while the risk was low in the January-February period, the longer we waited, the higher the risk became. You had homeowners who were renting out their properties not being truthful with their tenants, so we had an obligation to those customers as well. When we put information up we always have the ability to correct it, so if we did by chance put something up incorrectly, we could reissue another LIM or property file, no problem.
That being said, there are still ongoing discussions in this space for property files and LIMs, particularly around how we treat the occupancies of properties, and the likes of flats and apartments, because those become a lot more complex. Do we disclose that one property was placarded to its neighbouring property, if they're attached? We got some external legal advice on that, and we're just going through that process at the moment.
On this slide I've included some examples of what we've done. I won't open them unless you want, but I'll send the slide through and you can feel free to open them. I mentioned the example of the LIMs, and the Rapid Building Assessment PDF we extracted from Survey123. Our documents are named to our Council naming standard. That includes the notice number given to the incident; the type of event, so you can see in the two examples we've got 'Flood' and 'Cyclone Gabrielle'; and the Property ID it belongs to, for our tracking purposes later down the track. And then in the actual contents of the document you can see the assessor's details, the date of the assessment, the building location details, which include a little GPS snapshot to show the actual location, photos, detailed assessments, and notes on further actions required.
Now the lessons learned, and there were lots of these! What went well: we had never had anything like this happen before, and we learned very quickly that we could collaborate really well together. There were multiple teams involved: our Property Information and Property Data Teams, Regulatory, Legal, ICT, our Communications Team, and most importantly our GIS Teams. Without the help of our GIS Teams and all the extracts they could get from our GIS system, this process of auditing and improving data quality would have taken far longer than it did. We would still be doing it right now.
We were also able to make some changes to Survey123, particularly having predictive address searches pop up, and manually identifying Property IDs.
What we could improve, or would like to improve, is automating the migration of survey PDFs into our record system. We had to rely on our external vendor to help us do that; we had thousands and thousands of documents and there was no way we could do it manually, even with the help of our robotics, because the robotics would have needed development as well.
Also automating the creation of our 'INs', Incidents and Notices; that is still very manual for our regulatory staff, so we would like to automate it. Improving data cleansing in the earlier stages. And the biggest thing for us was removing the reliance on spreadsheets when we extracted the data. When you rely on spreadsheets, everyone can take a copy, and then you have multiple copies flying around, someone might be working on the wrong copy, it just gets really messy, and then everyone starts questioning the integrity of the data and which is the true copy.
As for where to next, the points on this slide are just for the aspect I worked on. Obviously this is a beast of a project; there are still ongoing discussions in regard to rates, property valuations, updating our flooding maps, and all that kind of stuff. But for us the key focus is: ongoing training for users; extending the Property Search Project, as we have a range of 'service' and 'vanity' addresses that are not currently being picked up; endpoint automation of emails to update our customers; continual process improvement, where we want the FME server to flag properties that are occupancies for everyone; and then obviously the Compliance Monitoring Teams smoothing out their processing of the actual RBAs.
I tried to keep that short and sweet, because as I mentioned it was a beast of a project and I could probably be here all day talking about it. But I have put my contact details on this slide, so when I send the package out you can contact me with further questions if you don't have any right now.
Jade, we've got a question from Simon, thank you very much for that. Simon asks, "As part of the RBA, was there a photo captured of the address on the front of the property? We found that very useful in the data validation stage, to verify details against a photo of the property frontage." And he also said, "Thanks for your presentation and learnings, as it's not an easy task!"
So what I'll do is open up the RBA.
I'll see if I can find it.
Sweet, so hopefully you can see the example of the RBA we have up. I've just blanked out the details there, but you can see the building location and the GPS coordinates. Now, the issue we found with the GPS coordinates is that they're not going to pinpoint the exact location. They could be six metres off; it could be the neighbouring property. So we really had to combine all our efforts and knowledge and pull across anything, whether it was the homeowner's details, and match it to the Property ID. It was a very complex process!
Sometimes we do have images of the front of the property to help us identify them, and sometimes we don't, so that's part of the process we could improve. There's the map I mentioned, and these are the images taken; we've got the image of the flag, obviously. When we're given this information, the inspector may not have been able to identify the property while on site, so they may have just written a park name, or a road name and not the actual number. Or they could have written the number and the road name, but, to use Queen Street in Auckland as an example, we've got four or five of them in different areas. So it does become very difficult to narrow those down.
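The GPS-offset problem Jade mentions, a recorded point that may be several metres off and possibly over the boundary of the neighbouring parcel, comes down to a point-in-parcel lookup. Here is a minimal sketch in pure Python, with invented rectangular parcels standing in for real cadastral geometry (the IDs and coordinates are illustrative only):

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: True if pt (x, y) falls inside the polygon,
    given as a list of (x, y) vertices."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending left from pt.
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

# Two illustrative adjoining parcels, coordinates in metres (made up).
parcels = {
    "PID-1001": [(0, 0), (20, 0), (20, 30), (0, 30)],
    "PID-1002": [(20, 0), (40, 0), (40, 30), (20, 30)],
}

def locate(pt):
    """Return the ID of the parcel containing pt, else None."""
    for pid, poly in parcels.items():
        if point_in_polygon(pt, poly):
            return pid
    return None

print(locate((5, 5)))   # safely inside PID-1001
print(locate((21, 5)))  # only 1 m over the shared boundary: lands in PID-1002
```

With a roughly six-metre GPS error, a hit within a few metres of a shared boundary, like the second example, should be treated as ambiguous and confirmed against other evidence such as the homeowner's details or frontage photos, exactly as Jade's team did.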
But yeah, that's the general shape of what our assessments look like. You can see the comments describing what happened: the power points were within reach and have been taped off, and further evaluation is needed.
Thanks Jade, that's fantastic, and we've got another question from Susan. Susan says, "Thanks for your work, Jade. Impressed you have a document naming standard, and great to hear about the important role of GIS. Keen to learn more about the address mismatches and how to improve data quality for our key data improvement work here at Toitū Te Whenua." So Susan will contact you directly to follow that up. And finally Katy's just said, "Fantastic presentation, well done Jade, thank you very much." And on behalf of all of us, thank you very much, that was really, really interesting!
Well thank you for having me, and as I said my inbox is always open, so don't hesitate to reach out.
Yeah, thanks very much Jade. Your mention of issues around GPS accuracy has just sparked a thought: I should mention our SouthPAN project, where we're launching a satellite-based augmentation system over the next few years to increase the accuracy and precision of GPS. So another use case there.
Now to Manaaki Whenua's Dr Shaun Awatere, talking about Māori Frameworks for Recovery. Hi Shaun, I'll hand over to you now. Ooh, can't hear you for some reason.
[Dr Shaun Awatere]
Let’s change the microphone.
Ah, there we go.
How’s that, good?
That’s good yeah.
[Shaun starts presenting]
Kia ora tatou. He uri aho, noo Te Tairawhiti, noo Te Tairawhiti, noo Ngati Porou. Ko Shaun Awatere ahau. I'm a researcher with Manaaki Whenua Landcare Research. A lifetime ago I used to be a GIS spatial analyst, so it's good to be here! But today I'll be talking about the place of Māori land and property data within decision-making processes, particularly within disaster recovery. So I'm going to try to answer some of the questions that Ben provided to me.
So to start off with: data sets sit within a context for decision-making. Data is used to inform various types of decision-making processes, from climate change adaptation through to disaster recovery, so understanding the context is useful for understanding how data can inform those processes.
Things to note include, from an equity position: are we acknowledging the rights and interests of Māori in decision-making processes, especially within a te Tiriti context, which implores the Crown to acknowledge its responsibilities to the principles of partnership, protection, and participation?
What this means is that policy solutions need to be critically analysed. Is the policy approach a one-size-fits-all approach? Is it based on assumptions like individual property rights, or the sense that markets are the most efficient approach for solving resource-allocation issues? Or you might ask: does the policy approach consider that collective well-being encompasses not only the tangible aspects of life, not only whether there's economic production going on in an area, but the intangible and spiritual as well?
Just tying in to the previous conversation: what concerns me is that if we do focus primarily on primary production and areas with large populations, those who are more vulnerable, particularly in Te Tairawhiti, will be left out of those analyses.
If that's the case, then I think our policy approach will differ from one that's primarily technocratically focused.
So another kaupapa or topic to consider is, how can the property data help inform the process for decision-making or resource allocation, rather than leading the decision-making process or resource allocation?
If we take the latter approach, then technocratic solutions, often with their focus on perfect data, can lead us to imperfect outcomes.
Post-Cyclone Gabrielle, agencies waited until all the data had been collected on the damage to properties before making a region-wide risk assessment to inform any buyouts or repairs to houses. And that just took too long. Whānau were still living in those flood-damaged houses, and were often told to relocate to other houses that were overcrowded anyway.
These types of technocratic analyses often overlook the lived experiences of vulnerable populations; there's a lack of empathy towards people who are suffering through long-winded bureaucratic processes. The data will never be perfect, and therefore we need to rely on principles like equity and expert judgment, utilising the best data we have available at the time to make decisions in a timely manner.
So there is a role for agencies with property data to support hapu and iwi institutions that want to adapt to climate change, or that may be thinking about relocating their communities from flood-prone river valleys, erosion-prone areas, or areas that might be impacted by sea level rise.
Support could be provided to those hapu and iwi in a number of ways: from the provision of data sets, or providing access to data sets, through to basic analysis, such as overlaying property information with flood mapping tools, or making the hazard mapping portals and flood tools more accessible for community-based approaches.
At the moment it's set up primarily for individual property owners, because that's the focus of the clients the tools were built for. But what might be useful is to think about how you set up solutions or approaches for communities that are dealing with having to relocate the marae, the kohanga and the urupa.
So it'd be good to think about how you, as analysts, can offer more bespoke solutions for a community, and help support someone who might be interested in accessing individual titles, along with parcels that are subject to Te Ture Whenua Māori Act, from the Pātaka Whenua portal, as well as parcels that might be subject to the Māori Reserves Act, so that's marae and urupa.
So you need someone who can navigate all those different databases to provide useful information for the community analysts who are helping support whanau and hapu through these difficult times around relocation or climate change adaptation. And oftentimes, during these types of kaupapa, it's left up to the community navigator or champion to do not only the property data and analysis, but to do it alongside their other workload of governance and decision-making on behalf of their communities.
So it's really important that, as GIS analysts, you provide as much support and awhi to those community navigators as possible. You could tailor the portals to help those with the technical skills to access their data, but there still need to be capability-building processes to help people utilise and access those tools and, importantly, help them understand what the data is saying.
So how do we define and map local communities, the people, the land and properties within those boundaries? As a starting point, Te Puni Kokiri's got a great map of iwi boundaries, and I think this is a good place to begin conversations with hapu and iwi about how you actually work with the data they've got. And then in terms of Māori data, a good place to start a conversation with hapu and iwi is through the Te Kāhui Raraunga website.
Te Kāhui Raraunga has a report which describes the Māori data governance model, designed by Māori data experts for use across the public service within Aotearoa. The report says that Māori data is a taonga that requires culturally grounded models of protection and care, and the model provides guidance for system-wide governance of Māori data consistent with the government's responsibilities under Te Tiriti o Waitangi. So I think Te Kāhui Raraunga's Māori data governance model is probably a useful starting point for agencies to explore conversations around Māori data and the data sets that they hold.
And one final question, or statement, that Ben posed: Māori land and property data is often fragmented and relatively inaccessible, holding back effective te Tiriti decision-making processes. Central and local government often want to do the right thing but can't empower communities that they don't know about.
So, on the question of how to empower Māori communities utilising property data: once again, framing is everything. Rather than asking how Māori data can fit into the tools, the databases, the portals and so on, ask how the tools can help support the processes that will empower communities to realise their aspirations. And their aspirations are often quite different from individual property owners': they're about social well-being and cultural well-being, as well as economic well-being, all part of that mix.
This might mean acknowledging that databases have their limitations, and that often it's the relationships institutions develop with communities that matter most. In practice this means officials being flexible and adaptable, and I know as planners those are words you don't like to hear! But in practice it means being flexible and adaptable, and accepting that starting points for addressing an issue will often be from different positions.
For the usual client, the request would be within the framing of the remit of the database, because the database has been set up to achieve their outcomes. That is, it's based on a set of assumptions regarding the transferability of individual property titles to support market exchange.
However, when we're dealing with communal property there's a different set of priorities, like intergenerational sustainability of those assets within the limitations of natural ecosystems, and acknowledging the impact that colonisation has had on the capabilities of vulnerable populations, like indigenous peoples.
So more bespoke approaches might be required. An example could be working with regional mana whenua groups or standing committees similar to Toitu Tairawhiti, which is made up of the CEs of the four iwi in Te Tairawhiti: Ngāti Porou, Rongowhakaata, Te Aitanga-a-Māhaki, and Ngāi Tāmanuhiri.
Toitu Tairawhiti was formalised during the COVID response; they've formed a proactive leadership group to support whanau living in Te Tairawhiti, and are currently addressing the recovery effort in their rohe in response to Cyclone Gabrielle.
The types of things they've been doing include providing portable temporary homes for those impacted by the flooding. So it might be good to have a conversation with a roopu like Toitu Tairawhiti to identify where the opportunities are to support their initiatives, especially around housing.
Alternatively, it might be good to look at how the data used to support regional emergency management is provided to community organisers, especially when responsibilities are devolved from a command controller to each of the bays throughout Te Tairawhiti – Tolaga Bay, Tokomaru Bay, Ruatoria.
So instead of going directly to the command controller, think about how you might be able to support those local CDEM responses and the community organisers.
So in summary. Yeah national data sets have their place and their role in resource management and disaster recovery, especially when the information is up to date and complete. However when dealing with local knowledge, particularly Māori data, you know, often it's locally specific, context-specific, incomplete, and asymmetric in form.
So that means you have to come up with bespoke solutions for decision-making processes. You ought to seek out that data, and also support approaches that will empower hapu and iwi in those decision-making processes. So kia ora, everyone, thank you for providing the time for me to share this whakaaro.
Thank you, Dr Shaun, that was really quite a powerful perspective. I see Rob Deakin's just quickly got his hand up for a question, and we might take that one before, in the interest of time, passing over to Rob to present.
Sorry, no question, I was just going to applaud him.
Oh, you hit the wrong button.
Did a ‘hand up’ instead of an ‘applaud’.
Very well done, Shaun, thank you.
Okay, well thanks again, Shaun - Dr Shaun. We'll hand over now to Rob Deakin, Manager Resilience here at Toitū Te Whenua, who's going to be talking about the Emergency Management data portal we've been working on.
[Rob Deakin starts presenting]
Kia ora tatou. I'm hoping you can see the entry slide for a scintillating PowerPoint presentation! So yes, thanks for the opportunity to provide an update on some work we've been doing over the last 18 months or so, looking at how we can improve the sharing of data to support emergency management. It's really useful to follow on from some of the points that Jade, Chris and Shaun have just made in their presentations.
This was really born out of the COVID response, in that we recognised, yet again, how much room for improvement there is in our ability to share data and information to support decision-making. Particularly around response and recovery, but also more generally in terms of sharing information to better support all phases of emergency management.
So, you know, typically the problems are not knowing who has what information, or how to get hold of it, and there's a whole bunch of things that don't seem to work well, and each time we get a new disaster or emergency response we tend to be reinventing the wheel.
So, as a group of agencies - Toitū Te Whenua, the National Emergency Management Agency, Police, FENZ, the Defence Force, and Stats NZ - we sat down in the wake of the COVID response to try and identify what improvements we could make in that space.
You know, and we've already heard the background to this: during a response particularly, it's hard to find and analyse information. A lot of the time you're relying on existing relationships to know who's got what and how to get hold of it, and if you don't have those relationships with those organisations or individuals, then you're stuffed.
It’s often unclear which data is authoritative, and because of all of these things people are duplicating effort and time, at a time where, you know, time is of the essence.
So, we recognised that one of the things we could work on as a group was to try and solve this problem of how to make this information more accessible and visible. And it's not a problem that's unique to New Zealand; it's a problem that many nations are struggling with. We've done some work with the UN Committee of Experts on Global Geospatial Information Management over the last few years, and they have a strategic framework on geospatial information and services for disasters. This is the goal of that framework: really, to enable the sharing of information, to make it available for timely, well-coordinated decision-making across all 4 Rs of the emergency management cycle.
So it's a complex problem, it's not one that New Zealand alone is wrestling with, and there is just so much scope for improvement. And again that was highlighted through the recent cyclone response as well. So it's just a reminder that we're not as well set up as we should be.
So 18 months ago, LINZ and NEMA put a project team together, which was really to try and get to the heart of validating some of the assumptions we had about the problem, and then to come up with a number of solutions that might address it. And we took a very user-centric approach to this.
So we went into a research phase, where we spoke to a number of different individuals from a whole range of organisations over a four- or five-week period. We did a lot of interviews - with around 25 people involved in different aspects of emergency management, from government to community to NGOs - and heard their stories about the problems they were experiencing in trying to do the work that they're doing, and the roles that they need to fulfil during that.
We took the findings from those interviews to come up with some insights about what solutions could look like, and really to test some of the assumptions that we were making. And just going back to the point that Shaun made, one of the key things we found was that a lot of the solutions that have been designed in the past are really for government and local government - so, really excluding some of those community groups and NGOs who need the same sort of information to organise their own responses, but who aren't well catered for at the moment.
So, at the heart of it, the solution we've been looking at is trying to make data easier to find; to ensure that the data that is available is trusted; and to make sure people know where to go to get hold of the information before they need it, because running around during your response just eats up valuable time. And we want solutions that make it easy for organisations or individuals who've got valuable data to share it. A key part of it is that it's not just standing up a piece of technology: it's really working with existing user communities to make them aware of what is available and to embed the use of some of these tools in the ways that they work.
So, we came up with a solution in five parts that we've been looking at over the last 12 months or so. Part of that is a web-based catalogue, to provide that place to go to find out whether information exists - particularly geospatial information. Within that we've been looking at how you can curate data collections thematically, and that might be by event, or by region, or by iwi boundary, or by 'lifeline' type. You know, what categories do people really want to come to from their point of interest, so they can grab that information quickly, because it relates to the problem they're trying to solve in their particular situation.
A key thing is how do we put data sharing agreements in place before the poo hits the fan, because a lot of time is taken up trying to negotiate access during a response, and that's another waste of time. And then looking at how you can standardise data formats so that information that's collected in Northland is to the same format as that being collected in Southland, so that you can bring the things together so you can understand the larger picture by making it easy to pull those information sources together. And critically, how do we work with existing communities of practice to make them aware of the tools that are available, also to get them to iteratively help us design practical solutions to the way they want to work.
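To make the standardisation point concrete, here is a minimal sketch of normalising two regional feeds onto one common record format so they can be merged into a national picture. All field names, region names, and records here are invented for illustration; they are not the project's actual schema.

```python
# Hypothetical sketch: mapping source-specific records onto a shared
# schema so data collected in different regions can be combined.
COMMON_FIELDS = ("site_id", "region", "status", "updated")

# Each region publishes the same information under different keys
# (these mappings are made up for the example).
FIELD_MAPS = {
    "northland": {"id": "site_id", "rohe": "region",
                  "state": "status", "last_seen": "updated"},
    "southland": {"asset": "site_id", "area": "region",
                  "condition": "status", "timestamp": "updated"},
}

def normalise(record, source):
    """Map a source-specific record onto the common schema."""
    mapping = FIELD_MAPS[source]
    out = {common: record[local] for local, common in mapping.items()}
    missing = set(COMMON_FIELDS) - set(out)
    if missing:
        raise ValueError(f"{source} record missing fields: {missing}")
    return out

north = normalise({"id": "N-01", "rohe": "Te Tai Tokerau",
                   "state": "open", "last_seen": "2023-03-01"}, "northland")
south = normalise({"asset": "S-77", "area": "Murihiku",
                   "condition": "closed", "timestamp": "2023-03-02"}, "southland")

# Once normalised, records from both regions can be combined directly.
combined = [north, south]
```

The value is that downstream tools only ever see `COMMON_FIELDS`, so pulling together a larger picture becomes a simple concatenation rather than a per-region translation exercise.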
So, very quickly: we went into a second stage of research, circling back particularly on iwi and Māori use cases, and on data sharing agreements with the lifelines communities, and we've looked at developing what prototype solutions might look like.
And then over the last six months we've been taking that iterative design process and rolling out an actual working prototype that's now available online. The focus of the last six months of work has really been to take existing data catalogues and see how we can pull that information through into a tool.
Can we use a commercial off-the-shelf technical solution to host that catalogue? The ArcGIS Online platform is something that LINZ and many others have access to, so how much of the functionality that we've heard users want can be provided without the need for really expensive bespoke software design?
And particularly how can we test some of the aspects around getting metadata in and developing search functions that enable people to work through the catalogue. And, how do we engage with - and we've engaged with a bunch of users again through this design process.
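As a rough illustration of what "search functions over a catalogue" on ArcGIS Online can look like, the platform exposes a public REST search endpoint. The endpoint and the `q`, `f`, and `num` parameters below are part of the documented ArcGIS REST API, but the search keywords are invented placeholders, not the project's real catalogue identifiers.

```python
# Sketch: building a search query against the ArcGIS Online public
# REST search endpoint. No network call is made here; the resulting
# URL can be fetched with urllib.request.urlopen() and the JSON
# response's "results" list inspected for matching catalogue items.
from urllib.parse import urlencode

SEARCH_URL = "https://www.arcgis.com/sharing/rest/search"

def build_search_url(keywords, item_type="Feature Service", limit=10):
    """Build a catalogue search URL for items matching the keywords."""
    query = f'{keywords} type:"{item_type}"'
    params = {"q": query, "f": "json", "num": limit}
    return f"{SEARCH_URL}?{urlencode(params)}"

# Placeholder keywords for illustration only.
url = build_search_url("emergency management flood")
```

Searches like this are what catalogue metadata ultimately feeds: the richer the titles, tags, and descriptions attached to each item, the more useful the results are to someone looking for data under time pressure.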
Given where we are with time, I'm not going to go into a system demo or presentation, but if people want to look at what we've done I'm more than happy to set up another separate session to do that.
What I will say is that over the next 12 months we're going to continue prototyping the solution. Having tested out the technology platform and established that it's pretty much fit for purpose, the areas we're going to focus on include catalogue curation, which is getting that vital content in place. We're also asking how this initiative fits with other initiatives operating across government - because we know there are several of those, around climate data, around environmental monitoring and reporting, around property data - and we don't want to duplicate effort.
We know that a big piece of this work, making it successful, is around the process and governance of the monitoring and use of the catalogue. We'll be continuing to go through iterative design processes with different user types. So, we're looking at particular use cases. So they might be with lifeline utilities, they might be with CDEM groups. But are there other communities that are keen to work with us over the next 12 months as we continue with this prototyping?
And if any of you are out there from organisations who are interested in sharing data and information that you have, then please let us know. If you're interested in joining the project as a tester, we'd be really interested to hear from you, so that we're in a position to get your feedback on what we're doing.
And that is pretty much it in five or so minutes! So, you know, any questions in the chat and we can follow up with them. But also if you are interested in being involved in the next phase of development, and if you do have data that you would like us to make available to others, then please get in contact, and we can – would love to work with you over the next 12 months. To see where we can get to.
Okay, thanks a lot Rob, that was really interesting and we might, yeah, that's a good thought, we might put feelers out to see if people are keen for a demo session separately?
It sounds like that'd be something good to do. And, yeah, we'll send these out with the notes obviously - this slide pack. So, if anyone else that wasn't able to attend today is interested to help, they can hopefully also get in touch.
Yeah, so, thank you everyone – well - thank you again to Rob, and thank you everyone for joining for the last couple of hours, and I know bellies are rumbling. So we will wrap up there, without much further ado. We can expect to see the slide pack and notes come out in time, and contact information for each of the presenters as well, where they're happy for that to be shared, and to get in touch. Yeah, thanks again everybody, have a great day, and we'll share the recording for this as well.
Okay, thanks guys.