Our Submission: Changes to Part 3 of the FOI Guidelines

We have made a submission to the Office of the Australian Information Commissioner (OAIC), which has been consulting on proposed changes to Part 3 — Processing and deciding FOI requests — of its FOI Guidelines.

The FOI Guidelines detail how the Information Commissioner expects authorities to handle FOI requests. We’ve shared our feedback to the OAIC below:

The OpenAustralia Foundation (OpenAustralia) would like to thank the Office of the Australian Information Commissioner (OAIC) for the opportunity to provide comments on the planned updates to Part 3 of the FOI Guidelines: Processing and deciding on FOI requests.

OpenAustralia is a public digital online library making government, public sector and political information freely and easily available for the benefit of all Australians. Our services also encourage and enable people to participate directly in the political process on a local, community and national level. We are strictly non-partisan and are not affiliated with any political party. We are simply passionate about making our democracy work.

OpenAustralia hosts Right to Know, a digital library that allows people to make and find Freedom of Information requests. We bring a unique perspective built over the last 14 years, having helped people make more than 12,000 requests for information to Commonwealth, State and Territory governments.

We would like to acknowledge the work done by the OAIC to create the Self-Assessment tool for agencies.

Our comments are made in the context of our experience with public interest FOI requests. They are also largely confined to changes highlighted on the OAIC consultation page, due to challenges identifying changes between the current and proposed versions, which we have raised separately with the OAIC. 

We appreciate the OAIC’s commitment to review better ways for stakeholders to identify, review and engage with future updates.

Processing requests for Personal vs. Non-Personal information (from the point of view of the Applicant)

Paragraph 3.16 of the FOI Guidelines makes it clear that the FOI Act does not require an applicant to prove their identity, nor does it prevent a person from using a pseudonym, unless they are requesting their own information.

The FOI Guidelines should clearly outline each scenario (Personal vs. Non-Personal) distinctly and independently; the two should never be conflated.

We reviewed the FOI application forms for the below agencies:

  • Department of Home Affairs
  • Services Australia
  • Department of Defence
  • Department of Health
  • AUSTRAC
  • Department of Climate Change, Energy, the Environment and Water
  • Australian Taxation Office

We found that the application forms already published by these agencies request the below information:

  • Full Name
  • DOB
  • Country of Birth / Country of Citizenship
  • Phone Number
  • Organisation Details including ABN
  • Individual identifiers (such as CRN, TFN etc)

None of these details are required for a public interest FOI request. 


In a number of cases, the wording on the website does not explain clearly (or at all) that applicants do not need to use a form to make an FOI request. An applicant unfamiliar with the FOI process could easily assume that providing this information via that form is required rather than optional. 

Where an applicant is seeking non-personal information, agencies must be mindful of their privacy obligations to the applicant. Agencies have an obligation under various laws to only request the minimum information required to process a request. Agencies should proactively advise applicants that, if they are making a request for non-personal information, they can do so anonymously, and without providing any of their personal information.

There are very good reasons why applicants may not wish to provide their personal information to the government, and whether or not an applicant provides personal information when requesting non-personal information must not be considered by agencies when making decisions about their requests.

Where forms, and the systems that host them, make the default position one of requesting more of the applicant’s personal information than is needed, this does not respect the applicant’s privacy.

Easy to understand

While reviewing these guidelines we observed that the documents are difficult to understand for people who do not have a deep understanding of the FOI process.

While we acknowledge that this document is primarily aimed at FOI practitioners, applicants and members of the public need to be able to understand the guidelines to be confident that decision makers have complied with them. More work is needed to make these guidelines easily readable and understandable to a general audience.

The guidelines also refer to sample freedom of information notices on the OAIC website. We note that these notices, while helpful, contain excessive legalese that easily confuses people who are not familiar with the legal aspects of making an FOI request. They also do not seem to be optimised for use in an email. Given that these samples are intended to be used with applicants, research should be conducted with applicants to ensure that they are fit for purpose. We would be very glad to work with the OAIC, separately from this consultation, to help improve these.

Paragraph specific comments

We have included our comments on a number of paragraphs below.

Paragraph 3.18

Further information on the potential and current risks and problems with “artificially generated” requests is needed before a solution is included here, and should be considered separately.

Without clearly identifying the risks posed, it is unclear how encouraging the use of forms would help. What it would do is require the applicant to provide more personal information than required by law.

We are concerned there is a greater harm of agencies requesting an unreasonable amount of personal information to process a request. There is also a greater likelihood of this happening – in fact, it is already happening. This is inconsistent with s 15(2) of the FOI Act and APP 3.

Paragraph 3.29 (note 21)

There’s an opportunity to update note 21 in paragraph 3.29. Currently, this says:

The OpenAustralia Foundation Ltd, a registered charity, has developed a website (www.righttoknow.org.au) that automates the sending of FOI requests to agencies and ministers and automatically publishes all correspondence between the FOI applicant and the agency or minister. Agencies and ministers should consider whether the FOI request involves personal information or business information when dealing with public internet platforms facilitating FOI requests.

As the subject of this note, we would appreciate it being updated to something similar to the below:

The OpenAustralia Foundation Ltd, a registered charity, operates Right to Know (www.righttoknow.org.au), a service that automates the sending of public interest FOI requests to agencies and ministers. Right to Know automatically publishes all correspondence between applicants and the agency or minister. Right to Know does not allow people to make FOI requests for their own or someone else’s personal information. The Foundation encourages FOI officers to contact its team to discuss any request that may result in the disclosure of personal information.

Our proposed revision reflects that our service is designed to promote the objectives of the FOI Act, and to facilitate information to the world at large. It also stresses that Right to Know, as a service, does not handle requests for personal information. We encourage anyone (including agencies) to contact our team directly to discuss any request.

We note that the former Information Commissioner has previously made it clear that requests from Right to Know are valid FOI requests. We submit this should be made clearer, given the acknowledgement from the OAIC that agencies prefer applicants to use their own online forms, which may collect more personal information about the applicant than is required under the Act.

Paragraph 3.30

We reiterate and emphasise the concerns we have raised in response to paragraph 3.18 with regards to public interest requests. 

Paragraph 3.32 – 3.42

We suggest taking this opportunity to amend the title of this paragraph from “Assisting an applicant” to “Take reasonable steps to assist an applicant”.

This change makes the proactive requirement for agencies to take all reasonable steps to assist applicants clearer and more accurately describes this part of the guidelines.

Paragraph 3.145 – Advising the applicant of steps required to find documents

We support the requirement for agencies to provide more detail on the steps they have taken to locate documents. For example, in this response from the Department of Defence, the decision maker provided extensive detail on the searches conducted (including search terms), the number of results, the fact that consultation was required, the total number of hours needed to process the request and how that estimate was reached. This level of detail gives applicants enough information to refine their request, if they choose to do so, and provides evidence of sufficient searches in the event an applicant wishes to challenge a decision.

Paragraph 3.154

As the consultation page articulates (emphasis added), it will generally only be appropriate to delete public servants’ names and contact details as irrelevant under s 22 of the FOI Act if the FOI applicant clearly and explicitly states that they do not require that information.

While we understand that agencies may wish to give the option for the applicant to exclude staff details, agencies should not ask this question when the FOI request is submitted. Only once the documents have been identified should applicants be asked to anticipate the relevance of the material. Use of pre-emptive, leading web-form checkboxes or template requests asking applicants to consider the exclusion of public servants personal information as a condition of submission is not acceptable.

We support the agency consulting with the applicant in the event that they identify (or are likely to identify) public servants’ personal information in response to a request. It makes sense to advise the applicant what information has been found and whether those details are considered relevant. For example, it may be that a name is important but direct contact details are not. In another case, the personal login details of a public servant may be the only important information, such as where a request concerns access or changes to material.

Paragraph 3.155

Applicants should not suffer detriment (including increased charges) if an agency requests that names and contact details of public servants be removed under s 22. Further, agencies should not charge for consultation time required between employees of the agency.

Paragraph 3.160

We are supportive of the requirement for an agency or minister to mark the reasons for redaction within the document. 

Applicants would benefit from a schedule of documents which detail the specific information that has been redacted, or alternatively the information redacted within the document itself. For example, instead of simply saying “s22” it might be easier to say “s22 – Staff Name” or “s22 – Phone Number”.

Agencies should be mindful that people making requests under the FOI Act may not understand the various sections of the law or how they interact. OpenAustralia Foundation receives many queries from people seeking to understand how the law may apply to their request, or how they may ask for information. Citing legislation should be discouraged in favour of “plain English” decisions, and this extends to redactions.

Paragraph 3.165 – 3.174

The guidelines should encourage agencies to make clear to applicants where a search has not been conducted for any documents that fall under section 25 of the FOI Act.

Paragraph 3.176

This guidance is incomprehensible to the general public, and difficult even for those of us who are somewhat familiar with the FOI process.

Paragraph 3.210

While agencies should be allowed to ask if applicants are happy to share their contact information, agencies should not be able to dictate telephone contact as a method of communication. There are several reasons for this, including accessibility (for example, a person who cannot use a telephone) as well as privacy concerns (such as an applicant not wanting to disclose their identity). Applicants should not be disadvantaged for wanting to use their preferred communication method.

Paragraph 3.213 – 3.214

Anecdotal evidence suggests applicants value proactive suggestions on ways that a request could be improved, such as by providing a date range or by limiting the scope of documents. Providing this information proactively supports the understanding in paragraph 3.178 (An FOI applicant may not know exactly what documents exist and may describe a class of documents).

Paragraph 3.215 – 3.216

We appreciate that the legislation requires an applicant to provide written notice to the agency as part of the request consultation process. That said, agencies will be in a better position to correctly articulate any verbal agreement following a phone call.

Where an applicant has made a request via email, agencies should be encouraged to follow up verbal agreements in writing, and should accept an affirmative response from the applicant as written notice.

Alternatively, if the above is inconsistent with the law, agencies should be required to assist the applicant to modify the scope of the request after verbal agreement.

Paragraph 3.218 – 3.225

We fully support paragraph 3.223 which encourages agencies to consult applicants about release on a flexible and agreed basis. 

Agencies should not seek to unfairly discriminate or disadvantage an applicant (including by charging an applicant) for the time taken to consult the applicant.

We are generally supportive of the remaining guidelines surrounding requests under s 17 of the FOI Act. 

Paragraph 3.249

We support the requirement for applicants to be informed that a decision is “deemed refused” at the earliest possible opportunity.  If an applicant makes a request for internal review, the agency should be required to advise the applicant of the IC review process. The agency should also advise the applicant if it intends to continue processing the request despite a decision being deemed refused.

Some applicants may not be aware that their application is “deemed refused”, and are more likely to describe the application as “delayed” or “not processed in time”. 

Processing timeframes should be clearly communicated at all stages of a request, including at the start and when the timeframe has changed (for example, due to consultation). Agencies should be required to take all reasonable steps to prevent a deemed refusal, including by requesting an extension from the applicant or from the Information Commissioner.

Paragraph 3.288

We do not generally support the use of SIGBOX or other secure file sharing services for non-personal public requests for information. Applicants should not be required to provide a username and a password to a third-party service in order to access their documents, nor should applicants be required to access websites or file sharing services for documents or notices.

There are several risks with secure file sharing services, including the intentional or inadvertent excessive collection of personal information, as well as the security risks associated with requiring people to create and use a password. Further, this again requests more information than the FOI Act requires the applicant to provide in order to make a request.

We look forward to seeing the new and improved guidelines!

Posted in RightToKnow.org.au

Why you can’t request external reviews via Right to Know

We’ve recently updated the page for the Office of the Australian Information Commissioner, also known as the OAIC, to make it clear people can’t use Right to Know to make FOI review applications. We’ve done this after seeing an increase in the number of external review requests and complaints being made via Right to Know.

The OAIC is a unique authority. They process external reviews and complaints for FOI, as well as privacy complaints. You can also make a request directly to them for information. 

Right to Know was not built to handle reviews, and can’t tell the difference between a review and a request, which is why it’s not suitable for external reviews.

We understand this might be frustrating to some, however there are a few reasons for this:

  • When you make a request via Right to Know your request is automatically sent to the FOI team. The OAIC has a different team for handling reviews and complaints. This means your review will need to be forwarded, increasing the risk it will be delayed or even lost.
  • The logic that powers Right to Know assumes that you’re making an FOI request, not an external review. This logic is used to set the status (for example, “long overdue”), as well as to drive changes behind the scenes that prevent spam. The impact of this can range from confusion about the status (an FOI review may not be “long overdue”) all the way to emails not being delivered properly.
  • Right to Know uses a unique email address to track each request on the site. When you make an external review, another email address is created (one for the original request and one for the review). This makes it harder for the authority and the OAIC to know which email address they should respond to. It also results in emails ending up on multiple requests on the site, which causes confusion for everyone and can take a lot of time to clean up.
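The per-request addressing described above can be sketched in a few lines. This is a simplified, hypothetical illustration (the class, token scheme and address format are our invention, not Right to Know's actual implementation): each request gets its own inbound address, and incoming mail is routed purely by that address, which is why a review filed through the site ends up as a second, separate thread.

```python
import secrets

class RequestMailbox:
    """Hypothetical sketch of one-address-per-request email routing;
    not Right to Know's actual implementation."""

    def __init__(self, domain="example.org"):
        self.domain = domain
        self.requests = {}  # token -> request title

    def new_request(self, title):
        token = secrets.token_hex(4)  # unique token per request
        self.requests[token] = title
        return f"foi+{token}@{self.domain}"

    def route(self, address):
        # Incoming mail is matched to a request purely by its address.
        token = address.split("@")[0].removeprefix("foi+")
        return self.requests.get(token)

mailbox = RequestMailbox()
original = mailbox.new_request("Original FOI request")
review = mailbox.new_request("External review of the same request")

# The two addresses differ, so a reply sent to the review address is
# filed against the review thread, not the original request.
assert mailbox.route(original) == "Original FOI request"
assert mailbox.route(review) == "External review of the same request"
assert original != review
```

Because the site has no concept linking the two tokens, correspondence about one matter splits across two request pages, which is the clean-up problem described above.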

We understand anonymity is one of the many benefits that people value when making their requests on Right to Know. If you need to make an external review request and would like to protect your anonymity, consider:

  • Use an email address not linked to your normal email address or real name.
  • If you use a pseudonym on Right to Know, use the same pseudonym when making your review request to the OAIC.
  • Keep your original request updated by adding annotations. Consider putting the OAIC reference number in your first annotation, so the OAIC knows you are the person who made the review request.

Would you find it useful to be able to make external review requests directly in Right to Know? Let us know by completing this quick survey.

Posted in RightToKnow.org.au

Do the teal independents always vote with the Greens?

We have been alerted to some political advertising that claims to show that teal independents are voting with the Greens Party over 70% of the time.

We cannot say for sure whether this is true or not, and likely no one else can either because that sort of information is not currently recorded anywhere, officially or otherwise.

Currently, the only votes that are officially recorded in any detail are called divisions, which are formal votes that involve our representatives moving to one side of the chamber or the other and having their names officially recorded. These types of votes happen far less often than votes “on the voices”, which is when our representatives shout “aye!” or “no!” and the loudest side wins. “On the voices” votes are only recorded by whether they pass or not, so there is no official record for how each individual politician voted in them or whether they were even present at the time.

This means that, if true, the claimed 70%+ agreement figure between teal independents and the Greens Party must be based on division data.

So the question then becomes: are the teal independents voting with the Greens over 70% of the time during divisions?

To answer this question, we can look at a particular teal independent MP’s voting record on They Vote for You and select the “Compare their voting record with someone else’s” option. As well as providing a number value for the level of agreement with other MPs, these comparison pages also provide a list of the policy areas on which the different representatives tend to agree or disagree.
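The agreement figure behind such a comparison is, at its simplest, the share of divisions both members attended in which they voted the same way (They Vote For You's actual methodology also weights votes by policy; this sketch uses made-up example data, not real voting records):

```python
def agreement(votes_a, votes_b):
    """Percentage of shared divisions in which two members voted the
    same way. Each argument maps a division ID to 'aye' or 'no';
    divisions that one member missed are ignored."""
    shared = set(votes_a) & set(votes_b)
    if not shared:
        return None  # no divisions in common
    same = sum(1 for d in shared if votes_a[d] == votes_b[d])
    return 100 * same / len(shared)

# Made-up example: the members agree in 3 of the 4 divisions they
# both attended; div5 is ignored because member_a was absent.
member_a = {"div1": "aye", "div2": "no", "div3": "aye", "div4": "no"}
member_b = {"div1": "aye", "div2": "no", "div3": "no",
            "div4": "no", "div5": "aye"}
print(agreement(member_a, member_b))  # 75.0
```

Even this toy version shows why such figures need context: the percentage depends entirely on which divisions happened to be held and attended, not on the full range of each member's positions.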

Alternatively, since the question relates to the Greens Party, we can pick a single Greens MP to make the comparison. Here, we’ve chosen the Greens MP for Ryan, Elizabeth Watson-Brown, because at the time of writing this article she had the highest attendance record of the Greens MPs, had never rebelled against her party and entered the House of Representatives at the same time as several of the teal independents in 2022.

To help give some context to this comparison, we’ve also included how Ms Watson-Brown’s record compares to Prime Minister Anthony Albanese’s and Opposition Leader Peter Dutton’s.

At the time of writing this article, this comparison showed that the Greens Party were actually voting more consistently with the Labor Party during divisions than with the teal independents in the House of Representatives. 

But before we conclude that the Labor Party and the Greens are always agreeing with each other, it’s worth looking at the situation in the Senate. For example, here’s how Greens Senator David Shoebridge was voting compared to Leader of the Government in the Senate Penny Wong (Labor) and Leader of the Opposition in the Senate Michaelia Cash (Liberal):

As you can see, in the Senate there seems to be significantly less agreement between the Greens and Labor and slightly more agreement between the Greens and Coalition.

So what does all this mean? 

While different teal independents do seem to have voted with the Greens Party between 66% and 76% of the time during divisions in the House of Representatives at the time of writing this article, this figure actually appears to be smaller than or roughly equivalent to the level of agreement between the Greens and Labor. However, this should not be taken as evidence that the Greens and Labor are always voting together, as the two parties’ agreement levels are significantly lower in the Senate.

In other words, while comparing the level of agreement between representatives can be useful, we must remember that it only tells part of the story and may be a wildly inaccurate representation of what’s really going on. So we should always use these figures cautiously.

Posted in They Vote For You

The rise of Yimbyism on Planning Alerts

Australia’s most recent housing turf war has found its way into the comments section of Planning Alerts. A look into the Yimby movement and its presence on the site. 

In 2024, we observed a new trend on our site. Users supporting individual Development Applications were commenting not only how a particular application would benefit their local area, but also how it would be a salve to the housing crisis more generally. 

These comments seemed to be largely coming from users who either self-identified or were identified by other users as “Yimbys”. 

Around the same time, we received a couple of complaints from users that these Yimbys were taking over Planning Alerts or even using bots to spam the site.

Curious about what was going on, we decided to dig a bit deeper.

Yimbyism or “Yes-In-My-Backyard” is a global pro-development movement which calls for the creation of more housing stock as a solution to housing unaffordability. The movement was established as a response to Nimbyism or “Not in My Backyard”, a blanket term referring to community advocates opposed to development in their suburb. 

Policies vary from group to group, but common Yimby themes include a push for urban design that supports social infrastructure, public transportation, and pedestrian safety. The movement is often associated with millennials concerned with rising cost of living and unaffordable housing prices and seeking change in their local community. 

In Australia, the movement has gained popularity in the last two years, emerging, as two Australian academics put it, out of “the post-pandemic housing crisis and policy debate”, as housing prices continue to rise.

Leading the Australian Yimby movement is the Abundant Housing Network, an alliance of four state-based groups working to create “sustainable, liveable and affordable” cities. 

Policy focuses for the network include the removal of zoning and heritage laws that prohibit construction, the creation of “gentle density” development in suburban areas and an increase of urban sustainability in town planning. The exact political ideology of Yimbyism remains ambiguous, with members from all sides of politics. While the movement is often touted as progressive or left-leaning, critics have been quick to point out the ideology’s foundations in right-wing economics and policies. As one American housing group described it, “From the get-go, YIMBYs embraced trickle-down economics or what’s now called ‘trickle-down housing’ policy.”

The movement has also been criticised for an overemphasis on housing stock as a solution to housing insecurity, and for its ongoing connections to the private property industry, which can be at odds with its local, grassroots persona.

So, how prevalent is Yimby & Nimby discourse on Planning Alerts?

We identified a combined 70 mentions of the terms “yimby” and “nimby”. Here are some trends we noticed:

  • The terms have been used on Planning Alerts since 2014. The earliest example we found was in April that year. 
  • Overall usage of both terms remains low on the site. There has been an uptick in the last few years with at least 15 mentions of the terms on the site in 2021, and 10 mentions in 2024.  
  • ‘Nimby’ was the most used term, accounting for 94% of all identified mentions (66). The term was used to criticise other users’ submissions, or to defend and clarify a commenter’s own standpoint, e.g. “I’m not a Nimby but”.
  • The remaining four references were made either by users self-identifying as Yimby (1) or by others criticising the ‘Yimby squad’ for being uncritical in their support of proposed developments (3).
The earliest example found on Planning Alerts, 2014

In the majority of cases (81%), the terms appeared in a general comment discussing Yimby/Nimby perspectives towards a development application, without particular reference to another individual user.

7% were directed comments, in which a user responded to or argued with another user’s perspective on a proposed development.

Only 11% of the total references involved individual users self-identifying as belonging to either camp, and again, this was skewed towards those identifying as Nimby (7) as opposed to Yimby (1).

Outside of our sample, we found evidence of Yimby groups using Planning Alerts to organise and campaign, using social media posts such as this one by Sydney YIMBY to encourage users to make a submission on the site. 

Screenshot of a post from the Sydney YIMBY Facebook group encouraging users to make a submission on a proposed DA in the suburb of Haberfield

We also found comments published to social platforms encouraging members to speak out in support of specific Development Applications or against anti-development submissions published to Planning Alerts.

We found a number of users sharing the same name as registered Yimby members who have submitted comments on Planning Alerts. Some of these users commented across multiple local council areas and even interstate. We cannot confirm these users are Yimby members, as Planning Alerts does not collect data beyond a registered name and email address.

We did not find any evidence that Yimby comments were the product of automated bots. Suspected Yimby submissions were tailored to individual development applications, and while some users were regular commenters, there was nothing extraordinary in the volume of submissions that would suggest an automated process.

Thoughts & conclusions

Planning Alerts is not invested in vetting Yimby or Nimby ideology. As an open source community platform that aims to enable residents to participate in their local development processes, we try to keep moderation of comments to a minimum.

That being said, our feedback section is designed as an easy interface to submit comments to the local council, not as a public comment section to debate other users. We caution users against using the form as a platform to argue with other users, as this can discourage people from having their say.

Comments that are abusive, unlawful or harassing — in other words, where people are going out of their way to harm another user — will not be tolerated as part of the feedback and you should report them. 

We also remind users that Planning Alerts is intended for commenting in your local area, not across multiple jurisdictions.

As an established movement, Yimbyism has organisational power that individual community advocates lack, being active in campaigning, resourcing and engaging with other avenues of community feedback (for example: here and here).

The movement has supporters in the private property market, among pro-development politicians and in some state governments. While we did not find evidence that Yimby members with established connections to developers and planners are active on the site, we want to remind users that this site is for local opinions, not a campaigning outlet for groups that already have a platform.

We will continue to monitor comments to ensure this remains the case: we want our platform to remain a place for genuine community feedback, not for coordinated efforts to influence development outcomes or to attack other community members.

Posted in PlanningAlerts.org.au, research

Introducing the biggest Planning Alerts redesign in fourteen years

We originally launched Planning Alerts in October 2009. That feels like a lifetime ago.

In that time all sorts of features have been added, there have been visual redesigns, and we’ve introduced many little improvements and fixes. Yet nothing we have done before feels as substantial and meaningful as what we’re launching today. 

On the face of it, the changes you see may seem like a visual redesign. In fact, they are the product of thorough research and deep thought. Taking this approach, we think we’ve created a beautiful and usable design that is much clearer and more satisfying to use.

While the new look might be a little jarring at first, we think you’ll come to appreciate it quickly.

We’re just so happy after so much work to finally share it with you!


This is what we are aiming to achieve with the new Planning Alerts:

  • Simple: Clean and uncluttered interface
  • Extremely legible: Easy-to-read text across all devices
  • Accessible: Big type and high contrast
  • Welcoming
  • Human: Small touches of whimsy to remember we’re people
  • Not government-y: Distinct and friendly design
  • Trustworthy: Improved transparency, without compromising ease
  • Respectful space: Inviting clear, respectful interactions
  • Transparent: Showing a little more of the process under the hood, so that people can interact with the service more productively, know what it’s doing, and be more confident that it is doing what they expect

Simple, Extremely legible and Accessible

The new site has big legible type across all screen sizes with good colour contrast everywhere. While no substitute for manual accessibility testing, we’ve also added automated accessibility testing as part of our test suite. This allows us to make changes more confidently, knowing that we’re less likely to inadvertently reintroduce some old accessibility problem.


Human Touches and Respectful Space

You’ll find illustrations of people as a subtle reminder that you’re in good company. There’s a symphony of other people here. People around Australia, people in our community and people working behind the scenes in government bureaucracy all read our comments and connect with our words. And of course we, the people running the service, do too. This is a service for everyone, so when you make a comment, remember that not everyone looks or thinks the same way as you. We all deserve to be part of a respectful dialogue helping to shape the built environment around us all.


Welcoming and Not Government-y

Colours! They’re a little idiosyncratic and we love them. They’re warm but not overbearing, cute but not cutesy, diverse but coherent… we think it would be hard for you to think this was a government service.


Transparent and Trustworthy

We have made some changes aimed at increasing transparency and trustworthiness. For example, when you write a comment, you’ll now preview the full email before it’s sent to council, instead of Planning Alerts magically delivering it to the planning authority. This lets you double-check what you’ve written and better understand any privacy implications. Importantly, you will also see more clearly exactly what will be sent to council and what’s publicly displayed on the site.

One of the many things we learned in the research is that we’ve embedded a lot of our values and ethics into decisions that have very specific outcomes for how we present planning information. However, we haven’t always spelled these values out directly. We are working to ensure they are not just held in our hearts but are clearly explained on our web pages and in our communications. One key example is our commitment to protecting your privacy.

Also, we realise the mechanics of the service haven’t always been clear to new users. So we’ve added a “How it works” section on the home page.


This is Just the Beginning

This redesign is just the start of a new stage for Planning Alerts.

While we think this is the most polished redesign we have ever launched, inevitably there will be rough edges or things that just aren’t working as they should. So if you find something that seems weird, wrong, or could be better, please share your thoughts with us. We really depend on you sharing your experience to help us focus on better meeting your needs.

Share your thoughts with us


Posted in Announcement, PlanningAlerts.org.au

The Australian Taxation Office (ATO) has revoked OpenAustralia Foundation’s Deductible Gift Recipient (DGR) endorsement

On Thursday November 16 2023, the OpenAustralia Foundation received the following letter from the Australian Taxation Office.

In it they said “Based on the information we examined during the review, we have determined that The Foundation is not operating as a public library to be entitled for endorsement as a DGR under items 1 & 4 of the table in section 30-15 of the ITAA 1997.”

We do not agree with their decision.

We are talking to our lawyers about our next steps. We will have more to say soon.

Additionally, they sent this one today.

Kat and Matthew

Posted in Announcement, OpenAustralia Foundation

They Vote For You – There’s something wrong with Peter Dutton’s voting record!

A screenshot of an erroneous They Vote For You voting record, showing Peter Dutton as a supporter of a constitutionally enshrined First Nations Voice in parliament. The screenshot has a large error symbol overlaid on it.

You may have seen this screenshot showing Opposition Leader Peter Dutton as a supporter of a constitutionally enshrined First Nations Voice in parliament. If this looks wrong to you, it’s because it is. Several votes on the Constitution Alteration (Aboriginal and Torres Strait Islander Voice) 2023 were erroneously connected to this policy, when they were actually votes on whether to have a referendum on the Voice (not whether to have the Voice at all).

Once it was pointed out that Mr Dutton was now showing up as a Voice supporter, the issue was corrected. First, the existing policy was put into draft mode and its text changed to highlight the inaccuracy. This made sure that subscribers to the policy would get a notification that there was a problem.

Next, two new policies on the subject were created that better reflected the votes we have available: one specifically on whether to have a referendum on the Voice and one more generally on implementing the Uluru Statement from the Heart.

Then we made a post on our social media about the problem and what we did to fix it (see Facebook, Instagram and Mastodon).

This isn’t the first time that a mistaken policy connection has happened on They Vote For You. See, for example, this blog post from 2015 on an issue with Andrew Wilkie’s voting record.

Since the policy connections are made manually, it’s inevitable that some errors will occur. So if you do see something that doesn’t look right – such as Mr Dutton showing up as a supporter of a policy he is actively campaigning against – then please let us know so we can get it sorted as soon as possible.

Posted in They Vote For You

Journey to Improving Planning Alerts: A Service Designer’s Perspective – Part 3 Toolkit for Service Improvement

Guest post by Service Designer and Researcher, Sabina Popin

This blog post is the grand finale in our three-part series on the recent Planning Alerts Service Improvement project. I kicked off the series with a bird’s eye view of the project, and the sequel dove deeper into the approach, research, insights, opportunities, and concepts. Now, let’s shift the spotlight onto the nitty-gritty – the outputs of our work and the fate of these insights.

Oh snap, we created Service Principles!

In the process of identifying key opportunities for service improvement, we generated How Might We statements: a method of creative questioning that produces prompts for generating ideas.

While we did indeed use them to help generate our short-term concepts for improving the service, to our surprise these statements also echoed some profound truths about how the service should operate. At first I wasn’t entirely sure what to do with them, conscious of not creating more artefacts that might add noise and confusion for the team. By being open about this with the team, it became clear that some much-needed Service Principles were taking shape: a set of aspirational instructions that would support design, development, onboarding of new staff and collaborators, and decision-making aligned with the core intent of the Planning Alerts service.

We dived headfirst into the details of these principles, which shone a light on what the Planning Alerts team implicitly believed about the service. Through this process we found that some earlier principles crafted to support development and design now had a new home – right next to the Service Design principles. Three sets of principles emerged:

  • Experience principles: How we want the service to feel.
  • Purpose principles: Core values and intentions of the service.
  • Delivery principles: The “how” of our service delivery.

The emergence of the Service improvement toolkit

During the research synthesis process I plotted the user needs onto the different stages of the user journey to understand people’s needs in the context of where they occur when using the Planning Alerts service. This was the moment a new kind of map formed – a map of service improvement opportunities. We found gold in this all-in-one map that encompasses insights about individual, organisational, and planning authority needs, opportunities for improvement, and ideas for those opportunities. Usually, these elements would need different maps.

All-in-one Service Experience (Improvements) Map

A couple of factors led us to believe that such an all-in-one map was the right approach.

At Planning Alerts the key decision makers are also the same people who are implementing the decisions. They need a tool that’s a Swiss Army Knife – determining strategic focus, stirring up solution discussions, and being a nest for future feedback.

Designing this jack-of-all-trades artefact posed quite a riddle for me. With so many functions, it risked becoming an overwhelming info dump, leaving it gathering dust. I shuddered at the thought of our hard work and precious insights being lost in the labyrinth of a poorly designed artefact.

This called for some good old-fashioned, back-to-basics information design hierarchy. Most importantly, it called for ongoing collaboration with the team to ensure they knew the artefacts back to front and developed a sense of ownership that would allow them to keep using them.

Triaging new feedback

During the map creation process, we realised that the team needed a way to differentiate between issues that could be addressed right away and those that needed further analysis. As valuable as GitHub is for tracking immediate tasks, it’s not as well-suited to issues requiring extended discussion and evaluation.

And there’s no lack of feedback — with the new user account features, we have more comments, more blog posts, more emails, and likely even more feedback as we implement new changes. So, we were faced with the question: What should we do with these new issues?

We needed a way for the team to organise these incoming pieces of feedback, opportunities and ideas, and to know how to use them to build on the needs that have already been uncovered and are housed in the Service Experience Map. This led us to experiment with a triaging process for any new issues or feedback that cannot be actioned immediately, so they don’t get lost.

Ok these sound great, but how do we make this work?

“We need a guide, some ‘how-to’ instructions!” was the consensus. And so, the toolkit concept was born. It will be an experiment, of course; we can’t guarantee it will be as beneficial as we hope. This led to an agreement on continued engagement: I’d work with the team once a week for at least the next three months, helping them iterate and integrate the toolkit into their strategic and daily decision-making processes.

Where to from here?

Short-term improvements and GitHub

First up, we prioritised some concepts that could be implemented right away to address recurring issues. These small changes have the potential for a big impact. The Planning Alerts team already uses GitHub, so I logged these ready-to-go issues there. Integrating this process was extremely valuable; it helped me understand the level of detail required for a concept to be actionable and, more importantly, how to weave user insights into the final design and features of a concept. I was watching the insights come alive.

Next, the idea is for the team to act on prioritised concepts, determining what they want to track and measure before implementing. This will allow them to evaluate new features against these metrics and make improvements based on feedback.

Ongoing strategic decision making

Having the support of myself or another service designer on an ongoing basis will help ensure the artefacts in the Planning Alerts improvement toolkit become a regular part of strategic conversations and decisions about the service.

Over time, they’ll need to continue working through the Service Experience (Improvements) Map, making improvements at both the incremental and the bigger-picture level. Part of my ongoing engagement will involve helping them cultivate a habit of using the Miro board and artefacts for strategic conversations. Additionally, they’ll need to start incorporating medium- and long-term improvements into the toolkit to make it an integral part of these strategic discussions.

Having a service designer on the team on an ongoing basis will help them refine their service principles, integrate artefacts into the onboarding process for new staff and collaborators, conduct ongoing research with users to understand evolving needs, and work with council and planning authority staff to better understand their needs and potential improvement opportunities.

Unique Challenges Require Unique Solutions

The chance to continue working alongside the team beyond the project and become true partners in the work is the greatest highlight for me. While it’s a service designer’s nightmare to see valuable insights go unused at the end of a project, the dream is to be there from the start — from research, strategy, and ideation through to implementation and beyond.

Faced with unique challenges, the Planning Alerts team showed flexibility and a willingness to try new approaches. This spirit of improvisation and adaptability was a key asset in the success of this project, serving as a powerful reminder that no team is too small to make a substantial impact. Size and resources are only part of the equation – collaboration, creative thinking, and resilience play a crucial role in the success of any service design project.

Posted in Planning, PlanningAlerts.org.au

Journey to Improving Planning Alerts: A Service Designer’s Perspective – Part 2 Research, Testing and Insights

Guest post by Service Designer and Researcher, Sabina Popin

This post is part of a three-part series about the recent Planning Alerts Service Improvement project. In the previous post I outlined the project in brief. In this post you can read in more depth about the epic journey we’ve been on to uncover opportunities for improvement, the insights and learnings we gained along the way, and finally the concepts that motivated the changes you will eventually see implemented on Planning Alerts.

From Email Inbox Research Insights to Concepts

The research for this project was done in two phases: email inbox data analysis followed by in-depth interviews with the people who use the service. An earlier two-month inbox study generated hypotheses and initial insights about people’s needs. For this project, I extended the exploration to almost a year’s worth of inbox emails (February to December 2022) – thousands of emails in total. This amount of data, coupled with the length of time analysed, was more than enough to give us a better understanding of how many people are affected by similar issues. It also helped in identifying the biggest pain points and moments of delight for both staff and the people using the service. From the analysis of this data, categories of needs emerged, which I then mapped along the Planning Alerts service journey stages to better understand the context in which they were occurring.

The biggest pain points for support staff and people interacting with the service

The Planning Alerts team works hard to be supportive and human in their responses to everyone who writes to them. Canned responses are used for some email types, which helps, but they often require adjusting to fit each enquiry. However, with only one support staff member servicing the whole of Australia, the individual handling and sorting of emails, and the time and thought needed to respond to them, mean that some emails get missed or replied to late. On top of this, working through the sometimes emotive or complex community issues raised in comment reports can be emotionally taxing and requires skill, focus, empathy and time.

As for people using the service, I observed that the issues causing the most frustration were when either alerts or the service didn’t work as they expected, for example missed alerts, or when they found information to be wrong or missing. In addition, many were under the impression that when they contacted Planning Alerts they were getting in touch with their local planning authority.

Uncovering opportunities

Once I had some key insights to share with the team, it was time to roll our sleeves up and really kick off the magic of the design process. I worked closely with the Planning Alerts team in a series of regular collaborative working sessions. These sessions were fast-paced, messy and iterative. We would simultaneously play back research findings, gain further insights, and generate ideas to tackle the issues at hand. It was a sizable departure from the typical service design approach, where sessions are more structured and separated into specific activities. It was also high energy seeing ideas form straight out of insights; it gave us the necessary momentum to keep chipping away. This unique all-in-one approach was possible because the people delivering and implementing the service were also the decision-makers. They could dive into the nitty-gritty details while also maintaining a high-level perspective.

Opportunity questions

Through our collaborative sessions we identified several key opportunity questions to help us generate ideas for improving the user experience on Planning Alerts. These questions guided us towards imagining solutions to common concerns and challenges people face. They gave us the necessary structure to sketch out ideas and form concepts:

  • How might we support people to express their needs in a way that is relevant and focused, so that they are taken up by council?
  • How might we support people to self-serve for known issues so that they feel empowered and don’t need to email in as often?
  • How might we ensure people are never left wondering about the reliability of the service?
  • How might we encourage people to be respectful of each other, so that there is opportunity for cooperation?

How do we prioritise what improvements to make?

We were fast approaching in-depth interviews with people who use the Planning Alerts service. At the same time we were grappling with a puzzle: we had a plethora of ideas, ranging from small UX tweaks to broader, long-term strategies like partnering with authorities to improve comment reception. We wanted to cover this broad range of ideas, and we also needed to prioritise actionable tasks for the coming year or quarter. Based on the key pain points from the email inbox study and support staff experience, we chose a few key opportunities and ideas to prototype and put in front of people. This would help us learn more about their needs and test our improvement hypotheses. This pivotal moment allowed us to focus on short-term goals while keeping an eye on the bigger picture.

I then combined and refined the ideas that came out of this idea generation activity into concepts, sketching them out digitally to make them approachable and ready to test.

  • Understanding how the service works: A step-by-step How it Works section for the home page
  • Getting help on the site: A dedicated help and support space on the Planning Alerts site that contains helpful articles, FAQs and a contact-us form
  • Getting help by contacting us: A contact-us form with links to helpful articles and guidance on how to ask for help
  • Getting help by replying to an alert: An automated email sent to people who accidentally reply to an email alert, with help articles and a link to support pages
  • Proactive information when not receiving alerts: An email notification sent to people who have not received alerts in some time to let them know why that might be the case
  • Commenting clarity: An improved comments section that draws attention to the important parts of a comment submission
  • Comment guidelines: Guidelines to help people write comments that have a better chance of being accepted by planning authorities
  • Comment preview: A built-in comment preview function to encourage people to confirm what they want to say and to understand how the comment will appear on Planning Alerts and what will be sent to the planning authority
  • Report comment categories: A reporting feature that adds a few more categories to comment reporting, supporting people to make a conscious choice about why they are reporting the comment

While the inbox gave some great insights, I still had a lot of questions. It was useful for forming a baseline picture of what might be happening, but I was still making a bunch of assumptions. I was really looking forward to talking to people to get their feedback on our concepts and to better understand what was really going on for them, either busting those assumptions or validating them.

All-in-one research and testing interviews: a winning combo

First, a note on research participant recruitment

Next up, talking to real people! Early on I worked with the Planning Alerts team to identify the different groups of people who use the service that we should speak with. This included people who have alerts, people who had registered for the new email accounts (which the team were concurrently rolling out), people who have made comments on Development Applications in their area, as well as people who work for planning authorities in local and state governments. We wanted to hear more about them and their needs to bring life and nuance to our understanding of their experiences. Yet we also didn’t have long to bring the pieces together, as we were heading into the long summer holiday season.

Recruiting participants for research in larger organisations is usually done through a research recruitment agency, from a list of people who have purposely signed up to participate in all kinds of research. When research recruitment is done internally, it often takes a lot of time because it can be difficult to find people willing to participate, even for a monetary incentive. I was shocked, or should I say pleasantly surprised, at how willing people who use the Planning Alerts service were to donate their time to participate in the research. We even heard a wonderful story: community residents shared with one of their elected councillors that the Planning Alerts research was happening, and she signed up to participate! WOW! This showed me how engaged the Planning Alerts community is, how much people genuinely care about what’s happening in their communities, and that there is immense opportunity for planning authorities to work together with Planning Alerts for the benefit of communities Australia-wide. Insert ‘smiling crying emoji’.

Gathering insights from people who use Planning Alerts

With the concepts and my trusty interview guide ready, I was all set for research and testing. I engaged with a total of 15 people in 60-minute one-on-one sessions. Participants were a mix of people who’ve used Planning Alerts: residents, community groups and council/planning authority staff.

The first 30 minutes of each session took an interview format to understand their current experience of Planning Alerts and their needs and aspirations when it comes to engaging with the planning process. I structured the interviews in this way to ensure we could learn about their experience in general as well as get their views on our hypotheses about what could improve the experience.

The second part of the interview focused on testing the ideas using the sketches and wireframes we created – this meant getting feedback to help understand what in their view works and what doesn’t.

We learned a lot from people about their experiences engaging with the planning process and with Planning Alerts. These higher level insights support our understanding of the complexities of the entire planning ecosystem and where Planning Alerts can play a role in supporting community engagement in the planning process.  You can read these insights below.

As for more immediate learnings that can help inform short-term changes to improve the Planning Alerts service, well, we learnt a lot about that too. The participants’ feedback helped validate that we were on the right track with the kinds of improvements we wanted to make: supporting a smoother and more transparent commenting experience; giving clarity around who Planning Alerts is; transparency around service coverage and interruptions; supporting self-service for known issues; and freeing up time for support staff to focus on responses that need more attention.

Participant comments and feedback helped us to refine and improve the concepts further. As you read this, you may have noticed that some of these changes have rolled out already!

Lessons Learned

Part of the prioritisation was already done; we knew that the concepts we took forward represented the key things to solve in the short term. But before we could be ready for implementation, we needed to iterate the concepts based on feedback and flesh them out so that they had enough detail to be designed and developed. And lastly, yes, you guessed it, ANOTHER level of more granular prioritisation. What goes off the rank first, like now, this quarter?

I found this immensely satisfying; I so rarely get to be involved from the discovery and research all the way through to writing a detailed specification for design and development. The end of the final concept prioritisation truly felt like a moment of celebration for everyone. 

But that wasn’t the end

With the concepts now fleshed out and prioritised, I needed to help the team get all this useful work into formats and artefacts that they could actually use on an ongoing basis. I wasn’t about to let my service designer nightmare come true and let all the precious insights go to waste! We began to think of this suite of insights, needs, opportunities and concepts as a toolkit.

And toolkits need clear instructions. They need to be well organised and approachable, they should neatly house all you need for the task at hand. 

By treating the suite of insights, needs, opportunities and concepts as a toolkit, we ensured that the Planning Alerts team could continue to put these valuable insights to work, and maintain a human approach to their service.

What did we actually learn about people’s experiences with Planning Alerts?

Planning Alerts keeps communities informed

Staying informed is a key reason that Planning Alerts is valued by residents, particularly as planning authorities are not required to directly notify residents of all applications. With Planning Alerts, community groups and residents are able to form their own understanding of the shape of their suburb.

For the councils and planning authorities that participated in the research, Planning Alerts plays an important role in keeping their community informed when they are not able to, or when their systems are harder to navigate. This can, however, cause additional work when the planning authority is not required to advertise but the applications are still picked up by Planning Alerts.

Easier to engage with Planning Alerts

Oftentimes Planning Alerts is seen as easier to engage with than council and planning authority portals. People appreciate being able to easily find a DA, comment on it, see other people’s feedback and be kept up to date on planning information in their area of interest.

Planning Alerts also makes it easy to share applications with the community via email and Facebook to discuss and rally together.

Confusion about who Planning Alerts is and what it stands for

Some people who use Planning Alerts think that it’s run by a local planning authority. This leads them to write to Planning Alerts with the assumption that Planning Alerts can provide them with information only a planning authority can, or to express frustration.

People want to be able to hold planning authorities accountable, but thinking that Planning Alerts is run by the government leads them to believe it doesn’t allow them to do that. There is a desire for Planning Alerts to bring clarity about who they are and what they stand for, and to emphasise that they are not connected to planning authorities.

Research participants expressed that emphasising the ethical, pro-democracy, independent, not-for-profit nature of Planning Alerts is important for building trust and for understanding the gaps left by each planning authority.

Biggest pain points relate to perception of service reliability

People appear to get most frustrated when alerts don’t work, when they show wrong locations or images, or when people cannot find the additional information and documentation needed to fully understand an application.

When privacy is in question, for example when comments are posted publicly, this can be a source of frustration, as people want immediate action and resolution.

People feel they need to understand the planning system to engage with it

People find the planning process, development applications and their accompanying documents difficult to understand. The more people engage with the process, the more they learn. While some do their own research, others rely on more knowledgeable members of their community. Councils and planning authorities feel they are doing their best to support residents to understand the planning process.

Residents and planning authorities agree that developing an understanding of the planning process is important to being able to engage with it. They acknowledge the importance of engaging directly with planning authorities. Experienced residents have learnt through experience that the next steps beyond making a submission are to form relationships with council planning staff and write to elected councillors and state representatives, ministers and shadow ministers.

Finding information on an application can be challenging

People looking for more information about an application have found the experience to be convoluted and time consuming. When they go to a council or planning authority website they often have to copy the DA number as some council links go to their homepage rather than to an application directly. Once they get there, the documents are hard to understand for anyone who is not well versed in the planning process.

People with more experience with the planning process feel that there is not enough information on Planning Alerts to understand an application fully, which leads some people to make comments based on little context.

Discussions and comments on applications

People appreciate the presence of other people’s comments. It allows them to get more information, to understand the experience of those in close proximity to the development. One community group that participated in the research is even using it to recruit new members.

People are actively sharing Planning Alerts links on Facebook, where there are discussions and feedback is shared. These discussions may not always make their way back to Planning Alerts nor to the relevant planning authorities. Some community groups however suggest and create guidelines for members to create their own submissions to planning authorities.

While other people’s comments are largely helpful, inflammatory and/or abusive comments do not go unnoticed. For some, seeing commenters ridicule or argue with others deters them from commenting; for others, it raises concerns about their own and others’ privacy.

Comments based on little information may not be valid

There is only a small amount of text on a Planning Alerts email alert and application page describing the nature of the application. Many people do not go through the process of fully understanding a development application before they comment; they make comments based solely on what is written on the Planning Alerts page. This can lead to misunderstandings about the nature of the application and an increased number of comments that planning authorities may consider invalid or irrelevant to the application at hand. These comments ultimately have no set process on the planning authority side and may not get responded to.

People (both planning authorities and residents) have said that it’s important for the community to have enough of an understanding of an application to be able to create a valid response, without having to pore over the minutiae of all the documentation.

People sometimes make comments that they acknowledge are not a direct objection or approval; sometimes they want to provide important context for the community, or have questions and want more information. In addition, planning authorities have noticed that some comments are not related to planning at all, e.g. commenting on the nature of the person applying.

Engaging directly with council or planning authority

Residents who have experience with the planning process are tactical with their commenting, only commenting on what is really important to them. In addition to commenting through Planning Alerts, they will make a submission directly through the respective planning authority and contact them directly as well.

They said they have had more luck getting a response when going directly to the planning authority.

Getting to know the planning staff at council is helpful for getting information or forming an understanding of the development process. It provides a direct contact to turn to about issues and a better chance of making change.

People wonder what happens to comments and why they don’t get responded to

People want to understand what happens to the comments once they are sent to the planning authority. There’s a strong sentiment that planning authorities and councils rarely respond or get back to people. This discourages people from commenting.

Some participants have suggested that council could respond directly to comments publicly to express why something has been resolved the way it has. To continue to feel motivated to engage and take action, people need to see a response from their local planning authority or state member. Research participants have shared that it seems to them the best way to get a response is to engage with councillors and state representatives directly.

Volume and validity of comments from Planning Alerts are challenging for Planning authorities

There is a much higher volume of comments coming in from Planning Alerts compared to planning authorities’ own channels. It’s challenging for planning authorities to distinguish between valid responses and non-valid ‘chatter’.

Ensuring comments go to the relevant people at planning authorities and are submitted in the appropriate format is essential for them to be responded to and counted as valid. For a comment or submission to be counted, planning authorities have a list of requirements. Depending on the authority, these may include: being made within the submission timeframe; including a full name and address; specifying approval or disapproval, the reasons why, and how the development impacts the resident; being relevant to planning requirements, i.e. built form or landscaping quality; being a unique submission rather than a copy-paste; and being mailed directly to the planning authority or submitted via a form on their site.

How Planning authorities respond to comments

Planning authorities have different rules for responding to feedback depending on the legislation. Most are only required to respond to valid feedback; non-valid feedback is recorded but may be lost and never responded to. Residents are not always aware of these rules. For some authorities, different volumes of feedback trigger different processes, e.g. more feedback gets escalated.

Due to organisational limitations, planning staff are not always well resourced. This means that comments that are not valid or relevant to the application are recorded but not always responded to. Staff don’t always have the capacity to send them to where they need to go, and some inevitably fall through the cracks.

There are also developments that don’t require consultation. When authorities receive comments from Planning Alerts for these applications, the back-end process becomes labour intensive, as they don’t have a process in place for dealing with unsolicited comments.

There are also different rules around advertising in different jurisdictions. In some jurisdictions authorities are not required to advertise certain developments, yet they still get responses from the community through Planning Alerts. This adds extra pressure even when they’re not able to do anything about it.

What’s next you ask?

Well, for that you’ll have to tune into the next blog post, where I’ll go into detail about the artefacts we created, why we created them and how we intend for the team to use them on an ongoing basis. Also, I suspect you’re wondering what happens to my relationship with Planning Alerts; well, we thought of something interesting. Wait and see!

Posted in Planning, PlanningAlerts.org.au

Journey to Improving Planning Alerts: A Service Designer’s Perspective

Guest post by Service Designer and Researcher, Sabina Popin

Welcome to the journey of improving Planning Alerts!

The OpenAustralia Foundation is a tiny organisation that gets a lot of email! Thousands of emails a year relate directly to the Planning Alerts service: people who use the service reaching out for all kinds of help. Currently the team responds to each email individually, and as the service grows this is becoming harder and harder to do. They are always on the lookout for ways to improve their services in the most cost-effective way possible, focusing on the smallest changes that have the most significant impact. In the case of Planning Alerts, however, they struggled to differentiate between the problems people were facing when using the service and the issues they themselves wanted to fix. They wanted a process that provided a more systematic approach rather than relying on gut feelings alone, and that’s where my service design skills came in handy. They invited me to join the team for a few months to put my skills to good use in helping them find opportunities to improve the service: ultimately, to improve the experience for people who use it, and to enable staff to put their effort in where it matters most so people can have a say on the issues that matter most to them.

In this three-part series, I will be sharing the team’s and my learnings from this project. In this post I’ll walk you through a summary of the entire project. In the second post, you’ll see how I conducted research and testing, and what insights emerged. Finally, the third piece dives into the project’s outputs and the team’s next steps. Put simply, you’ll get the low-down on the entire process behind some of the changes you will see around the Planning Alerts service.

A Beginner’s Mindset in Service Design

While Planning Alerts had never used service design at this scale before, for me this felt comfortingly familiar: it’s how I’ve made my bread and butter for the past 10 years as a service and strategic designer. Except for a couple of small details… I knew zero, zip, zilch, nada about planning. I had also never worked within the civic tech space… in a charity… run by a teeny tiny team of two! Eeeep!

Sure, uncertainty, anxiety, and self-doubt may seem like red flags in other professions, but for us, they’re key ingredients for good design. In my work I’m often diving head first into a new world for each project. And, for a service designer, having a beginner’s mindset is a must-have. By approaching each project with an open mind, we can connect with people on a deeper level, eliminating assumptions and biases along the way. Most importantly, by approaching with fresh eyes, we’re primed to spot nuances and details that even the savviest subject matter expert might overlook.

Building on the work of others

My 3-month adventure with the Planning Alerts team was packed with insights gleaned over 6-8 weeks’ worth of work. As a newcomer to this space, I was thankful for the earlier work Jo Hill did with the team to understand what people were getting in touch about and why. Jo Hill’s write-up and service blueprint provided a great overview of the service, while the categories and insights from that work were an invaluable starting point for mapping people’s needs. The results were eye-opening for the team and came with a set of suggestions on where they might focus next, which set the stage nicely for my work.

Diving into the Deep End: Unearthing Insights, Ideas and Opportunities

I dug into about a year’s worth (Dec – Feb 2022) of Planning Alerts inbox emails: thousands of emails from people. I then built on the previously created email categories and data analysis using a virtual wall built in the whiteboarding tool Miro. Through this mapping exercise, I identified key insights and opportunities for improvement. I met with the team regularly to share insights from the inbox research, and together we discussed the insights and generated ideas and opportunities over Zoom with screen sharing. This was made possible because the Planning Alerts team of two are responsible for both decision making and implementation. This means they’re able to hold a strategic and a practical implementation view at the same time; in fact, this is how they regularly work. So while I normally wouldn’t do things in this all-in-one way, adapting to their natural way of working meant that multiple pieces coming together in parallel felt exciting and energising, and gave us all momentum.

[Image: the whiteboarding tool Miro with email categories.]

But the inbox study alone wasn’t enough; we knew we had to bust or validate the assumptions we had made. So, to ensure we didn’t get stuck answering the wrong problems or miss some giant elephant in the room, we interviewed people who use the service. We asked them about their experiences using Planning Alerts and about the planning process in their area more broadly. In addition, we created prototypes of intended changes to the service and brought these along to test out.

After synthesising and analysing the interview data came another round of team sharing, reviewing, prioritising, and organising ideas into buckets of what is desirable, feasible, and viable to create in the short, medium, and long term. Together we unearthed some of this project’s treasures during epic Zoom sessions, including a series of vital service principles.

Creating Artefacts to Support Decision-Making and Action

The next step was to honour the treasure we found by creating outputs of different kinds (artefacts) that the Planning Alerts team could use to support their decisions to make practical improvements to the service. These included a set of current-state key insights and a Service Improvement Toolkit, which comprehensively document everything we learned about what people who use Planning Alerts need and their pain points, and outline service improvement opportunities, all in a Miro board.

We agreed that the artefacts I created would need to be super practical. The Planning Alerts team wanted to use them both to support their decisions on how to improve the service and to keep them on track and accountable. I turned the quick wins and high-priority opportunities for improvement into GitHub issues so that they’re immediately part of the usual working environment, ready for action.

In the next post, I will share more detail about the research and testing, the juicy insights that emerged, and the exciting new features that we prioritised for implementation. And trust me, you do want to see how this delicious sausage is made!

Posted in PlanningAlerts.org.au, Uncategorized