Summary of Findings:
It does not appear that many cities have gone beyond a traditional customer service scorecard. However, they are using more technology-heavy approaches to raise response rates to the traditional survey on satisfaction and dissatisfaction with services. At the same time, they are building a knowledge base about their customers and their locations, and using integrated technology to understand which services are being used most, and by whom. By cultivating a closer relationship with customers, they are able to understand how well customers' needs are being met. Beyond the standard Customer Satisfaction (CSAT) measure, other options include the Net Promoter Score (NPS) and the Customer Effort Score (CES). The only city from which we heard directly was the City of Gilbert, AZ, which has been considering the Net Promoter Score as well as a unified software system, essentially a Customer Relationship Management (CRM) system. The City of San Antonio and the City of Philadelphia have CRM systems, which make data collection and analysis easier for management. Pairing a survey with a unified software system, as these cities do, will help achieve new standards in customer service.
The research of interest concerns cities' assessment of customer satisfaction, as opposed to citizen/resident satisfaction surveys and general opinion surveys. “Customers are distinguished from general residents who may benefit indirectly from the delivery of county services. Residents, such as tax payers or voters, have a clear interest in the effective and efficient provision of county government services…. In fact, many organizations need to balance the needs of their customers with the needs of the public interest or the community at large” (King County Executive Office, 2010).
Cities that conduct customer service surveys and do it well tend to have integrative technologies, e.g., 311 systems, CRM systems, and mobile applications that allow easy and accessible interactions with citizens. These provide a greater level of citizen engagement and involvement, as well as valuable information to cities through better reporting and service follow-through. The systems allow residents to submit a request, which is routed to the appropriate department; when the request is complete, the resident is notified and presented with a survey. Because all requests are stored centrally, they are easy to track, access, and analyze. Cities in this group include the City of San Antonio, TX; the City of Philadelphia, PA; and the City of Gilbert, AZ. Surveys typically cover the interaction with the 311 call center and focus on how well and how quickly the customer's needs were met, what request was made, the customer's ZIP code and address, and other demographics, and they mostly consist of scorecard questions ranking interactions. Most surveys include at least one open-ended question.
Some cities measure customer satisfaction for different services through individual surveys at the department level, particularly for public works, utilities, courts, and fire and police safety. Often, cities tie dashboards to the results. Sometimes these results feed into key performance measures for the city and annual reports on citizen satisfaction. Because citizen satisfaction and community surveying were not the subject of interest, they are not included in this report. However, there is significant overlap: cities often ask for customer service feedback via citizen satisfaction surveys. As a result, cities may appear to be conducting customer service satisfaction surveys when the focus is actually citizen satisfaction with services. Cities with individual surveys, performance measures, and dashboards include the City of Cincinnati, OH, and the City of Scottsdale, AZ.
Challenges in using customer feedback to improve services include the type and volume of data that can be collected. The most informative data tends to come from open-ended responses, but these can be hard to manage and time-consuming to sort through. The data that is easy to report comes from satisfaction ratings, which indicate whether service performance needs to improve but offer little context on how a service might be improved. This is where focus groups and front-line employees can help collect the necessary feedback. The other challenge is getting customers to participate in surveys; systems like 311, CRM, and mobile applications improve response rates and data quality.
Overall, most cities' customer service surveys don't provide substantial feedback beyond whether a citizen was satisfied or dissatisfied with the request, how long the request took, whether their needs were met, and other drivers of satisfaction. Customer service performance measures can be tailored to what a city hopes to learn and improve, but no commonplace set of customer service performance measures could be identified. There is a great deal of research available on creating customer service surveys for government organizations.
Cities doing it well:
City of San Antonio, TX
The City of San Antonio uses its centralized CRM 311 system to collect feedback through surveys after the completion of a request. The city also has a mobile app, launched in 2015, that allows easy customer access and engagement. San Antonio measures customer satisfaction every other year through a resident survey. In 2018, the Customer-Service-311 Office landed in the top five departments for citizen satisfaction and came in second for most improved ratings. That same year, the city also held user design-work sessions inviting residents to share their experiences with the mobile app, which gave the city a better idea of what customers were looking for, including education about city processes. This feedback was instrumental in shaping the city's upgrades to meet customer needs and improve satisfaction.
They credit their centralized system as fundamental in creating the city's desired customer service experience for residents, on top of the data collected, including the volume of contacts received from residents and the percentage of service level agreements met. They don't collect a great deal of demographic information through the system, but the city does receive feedback through online surveys returned after a request is submitted. San Antonio collected about one year of survey data, roughly 600 responses, and used respondents' zip codes to create a “heat map” of where requests were coming from. By analyzing zip codes to determine what kinds of requests came from different locations, they could compare that with the requests received and determine whether the level of citizen feedback matched where complaints were being reported (Fleming and Stallcup, 2019). They say, “determining the level of customer service is the first step. Once that experience is defined and expectations set, the information collected can be used as a base for designing and/or enhancing a system” (Fleming and Stallcup, 2019).
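The zip-code comparison San Antonio describes can be sketched as a simple aggregation. This is an illustrative sketch only; the ZIP codes and counts below are hypothetical, not the city's actual data.

```python
from collections import Counter

# Hypothetical counts: survey responses and 311 service requests by ZIP code.
survey_zips  = ["78201", "78201", "78205", "78210"]
request_zips = ["78201", "78205", "78205", "78210", "78210", "78210"]

feedback = Counter(survey_zips)
requests = Counter(request_zips)

# Share of feedback vs. share of requests per ZIP: a rough check on whether
# survey responses come from the same areas the complaints come from.
for z in sorted(requests):
    fb_share  = feedback[z] / sum(feedback.values())
    req_share = requests[z] / sum(requests.values())
    print(z, round(fb_share, 2), round(req_share, 2))
```

A large mismatch between the two shares for a ZIP code would suggest that survey feedback under- or over-represents that area relative to actual requests.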
Sample questions from San Antonio's 311 follow-up survey:
- What method did you use to contact the 311 Customer Service Office?
- Please rate the speed with which your call was answered.
- How easy was it to use 311?
- How likely are you to use 311 again?
- Did you use 311 for information or to request a city service?
- Please rate your overall experience with the 311 Customer Service Office.
- Please provide additional feedback regarding the 311 Customer Service Office.
City of Centennial
CSAT, NPS, and CES: 3 Easy Ways to Measure Customer Experience
Interview with Kelly J. Ohaver, Customer Experience Manager for the City of Centennial, Colorado. This introductory blog spells out the three customer experience metrics that Kelly shared:
- Customer Satisfaction (CSAT)
- Net Promoter Score (NPS)
- Customer Effort Score (CES)
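These three metrics each reduce to simple arithmetic over raw survey responses. The sketch below assumes the common scale conventions (CSAT on 1–5, NPS on 0–10, CES on 1–7); the sample responses are hypothetical.

```python
# Minimal sketch of the three customer experience metrics above.

def csat(scores):
    """Percent of respondents answering 4 or 5 on a 1-5 satisfaction scale."""
    return 100 * sum(1 for s in scores if s >= 4) / len(scores)

def nps(scores):
    """Promoters (9-10) minus detractors (0-6), as a percentage, on a 0-10 scale."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

def ces(scores):
    """Average self-reported effort on a 1-7 scale (lower is better)."""
    return sum(scores) / len(scores)

print(csat([5, 4, 3, 5, 2]))   # 60.0
print(nps([10, 9, 8, 6, 3]))   # 0.0
print(ces([2, 3, 1, 4]))       # 2.5
```

Note that NPS can range from -100 to +100, while CSAT is a 0–100 percentage, so the two are not directly comparable.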
City of Cincinnati, OH
The city of Cincinnati presents the results of its customer service satisfaction surveys using visual maps and data to track service requests and satisfaction, and even to show visually the number and type of customer service requests that remain open. It collects data including date received, department, date the request was cleared, longitude and latitude, ZIP code, address, how the service was requested/handled (call center or mobile app), and whether the customer was satisfied.
Citizen Service Requests (CSR) give Cincinnati residents the opportunity to submit service requests for concerns like potholes, tall grass, and missed trash pick-up. Using the Fix It Cincy! mobile app, the customer service request online portal, or the hotline (513-591-6000), citizen service requests are routed directly to City departments, including Transportation & Engineering, Buildings & Inspections, Health, and Public Services. Once the department's work on the service request ticket is completed and the request is marked as "closed," customers receive an email notification that the work has been completed, with a link to an optional customer service feedback survey.
The data visualization shows customer satisfaction feedback by location, service request type, and department work group. Open CSR tickets are visualized in a separate online view.
City of Philadelphia, PA
Philadelphia emphasizes justifying activities, programs, projects, initiatives, or products through results-based processes. This has meant abandoning ambiguous performance measurements, forging social partnerships, and using efficient CRM systems to capture data.
The City of Philadelphia implemented a new CRM system to collect more accurate city data more efficiently, and to make it easier for both customers and employees to use and process data. The city's investment in the Philly311 mobile app likewise enables citizens to input data directly into the CRM system. Additionally, all customer-service-related data are now in a central location, allowing the city to easily track requests when needed. The system also generates new data based on input trends, such as:
• Volume of contacts made by the public
• Types of contacts made, for example, service vs. information requests
• Dropped or abandoned calls
• Wait time on calls
• Resolution of contact
• First call resolution/transfers
• Location/address of service request
• Errors and mistakes in data captured
• Duplication of service requests
Customer Satisfaction Surveys
Local governments can use customer satisfaction surveys to generate customer and performance data. Customer satisfaction surveys provide a quantitative measure of customer service. The feedback from surveys benefits an organization in several ways.
• Feedback on how well you are serving your customers
• Feedback that indicates how happy your customer is with your service
• Information on where you are excelling and how you need to improve
• Insight into customer demographics.
Traditional customer satisfaction surveys measure customer service after the fact:
Did the customer receive what he/she needed?
Did he/she find the experience favorable or not?
In contrast, efforts like a Mystery Shopper Program evaluate the process of delivering the service:
Was the employee smiling, and did he or she have a positive attitude?
Did he/she answer all the questions posed by the customer?
Did he/she take the time to explain processes or procedures?
Likewise, supervisors can monitor phone calls or review e-mails and other correspondence with citizens to assess the agent's manner and the people skills used in working with the citizen. Collecting quantitative data through surveys can also set performance standards for departments and organizations.
For example, the executive management team might decide that all service departments need to receive an average score of 4.25 or higher on a five-point scale. If a department scores lower, a customer service initiative could be undertaken in an attempt to secure a higher average score on the following year's survey. Results before and after the initiative can be compared to determine the program's ROI.
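The threshold check described above is straightforward to automate once departmental averages are available. In this sketch the department names and scores are hypothetical, and the 4.25 target is taken from the example in the text.

```python
# Illustrative sketch of the performance-standard check described above.
TARGET = 4.25  # minimum acceptable average on a five-point scale

dept_scores = {"Public Works": 4.4, "Utilities": 4.1, "Courts": 4.3}

# Departments below target are candidates for a customer service initiative.
needs_initiative = [d for d, avg in dept_scores.items() if avg < TARGET]
print(needs_initiative)  # ['Utilities']

# After the following year's survey, compare averages to gauge the
# initiative's effect (part of the ROI picture).
last_year = {"Utilities": 4.1}
this_year = {"Utilities": 4.35}
improvement = {d: this_year[d] - last_year[d] for d in last_year}
```

The score delta alone is not ROI; a full ROI calculation would also weigh the initiative's cost against the value the organization assigns to the improvement.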
City of Scottsdale, AZ
The city of Scottsdale, Arizona uses customer satisfaction surveys for specific departments, tailoring each survey to the context of the department.
City of Bloomington, IL
The city of Bloomington has satisfaction surveys for individual departments, such as public safety, and also uses interactive maps for public works requests. However, we were unable to identify any significant customer service surveys conducted through a unified approach.
City of Gilbert, AZ
Correspondence with Dana Berchman, Chief Digital Officer in Gilbert, AZ
We don’t have a unified software system, but it’s something that’s been discussed, like creating a net promoter score. I’m not aware of other cities who have done this.
On our 311 system, we do have a feedback option where the users score and rate us based on their experience, like Yelp, and we do track that score. Since we do funnel so many of our issues through the 311 app now it’s actually a decent way to track customer satisfaction. As of now, 311 is probably the closest tool we use to track this. Just to give you a sense of how our 311 use has grown, we had 7,757 issues/requests reported through the app in 2018 and in 2019 it was up to 12,000.
We also, as you can imagine, get a great deal of our feedback from customers on social media. And then there’s a form on our website where people can submit feedback and we send those to the appropriate departments when they’re received.
Measuring Customer Satisfaction: Improving the Experience of King County's Customers
Customer Satisfaction Surveys
Types of questions
King County recommends that customer satisfaction surveys should include four types of questions.
- Overall satisfaction
Overall satisfaction measures are derived from questions that ask the customer to rate the service as a whole, such as, “Overall, how satisfied are you with the service provided by this department?” This kind of question is intended to capture all aspects of the customer experience, from the time the customer begins his or her quest to find the product or service, through receipt of the product or service, and into the use of the product or service. The customer experience can include customers’ interactions with service staff throughout the course of this process.
- Drivers of customer satisfaction
If possible, the drivers of customer satisfaction should be identified prior to developing a customer satisfaction survey. Once the key drivers of satisfaction have been identified, effective survey questions can be developed around them.
- Customer characteristics
Departments should gather information that is important to better understand service delivery, customer segments, and overall customer demographics.
Examples of service delivery related information are as follows:
■ How services are accessed (online, in person, over the phone)
■ Day or time services are accessed
In order to understand and analyze your customer’s characteristics, it is recommended you gather basic demographic data (at the end of your survey) on the following background characteristics:
■ Zip code
- Open-ended questions
Recommended questions up to this point have been structured, fixed-alternative, Likert-type questions (e.g., multiple choice). In order to gather true qualitative impressions from customers and allow them to express themselves in their own words, it is recommended that each survey include at least one open-ended question.
Examples of possible open-ended questions are as follows:
■ Is there anything else you would like to tell us?
■ What is the most important thing we can do to improve our service?
Developing and conducting focus group discussions
Departments may want to consider using focus group discussions to supplement customer satisfaction measures. Focus groups bring a small number of people (usually 6-12 customers) together to discuss research questions and generate qualitative information about their feelings and opinions, as well as their reasons for those opinions, attitudes, and beliefs. In the King County customer satisfaction measurement process, focus groups would be most helpful (1) at the beginning of the measurement process, to identify and define drivers of customer satisfaction, and (2) near the end of the measurement process, to help interpret the results of the customer satisfaction surveys.
Analyzing, reporting, and using customer satisfaction information
Agencies should prepare to report both on overall customer satisfaction trends, and on their customers’ opinions about key drivers of satisfaction – the aspects of the customer experience that most influence their overall satisfaction. In addition, agencies should try to relate subjective customer satisfaction results to objective performance metrics.
Key performance measures:
These measures are additional objective data selected by the agency to provide context to the data gathered via surveys. They should relate to the key drivers of satisfaction identified by the agency. These measures are not derived from their customer satisfaction surveys, but come from agency records, documentation, and performance management systems. Some examples of key performance measures include:
■ Objective data on actual reported safety incidents vs. perception of safety derived from surveys
■ Objective data on timeliness vs. perception of timeliness derived from surveys
Once data have been collected and analyzed, departments will compare against performance targets for each measure and show trends when possible.
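King County's pairing of subjective survey results with objective records can be sketched as a simple side-by-side comparison. The metric names and percentages below are hypothetical placeholders, not King County data.

```python
# Hypothetical data: perceived performance from surveys (percent of
# respondents rating the service positively) vs. objectively measured rates
# from agency records.
measures = {
    "timeliness": {"perceived_pct": 72.0, "objective_pct": 88.0},
    "safety":     {"perceived_pct": 65.0, "objective_pct": 94.0},
}

# A large gap between objective performance and perception can suggest a
# communication problem rather than a service-delivery problem.
for name, m in measures.items():
    gap = m["objective_pct"] - m["perceived_pct"]
    print(f"{name}: perception trails measured performance by {gap:.0f} points")
```

Trending both numbers against a per-measure target, as the guidance recommends, only requires adding a target field and comparing each year's values to it.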
Measure to Improve: Improving Public Sector Performance by Using Citizen-User Satisfaction Information
EUPAN – the European Public Administration Network – is an informal network of the Directors General responsible for public administration in EU Member States and the European Commission.
This guide acknowledges that general satisfaction surveys offer little data for organizations to make improvements.
What should customer satisfaction measurement do for an organization? In short, customer satisfaction measurement should be viewed as a tool to enhance:
- Customer focus
- An understanding of the key drivers of satisfaction
- Strategic alignment
- Performance management
- Efficiency and cost savings
This is an in-depth, high-quality overview of how to use satisfaction data in a government organization: how to measure satisfaction, options for methods of collection, and how to analyze the data, use it, and follow up.
More sources on customer service and satisfaction surveys in government:
Improving Customer Service Through Effective Performance Management
Six Tips On Collecting Meaningful Customer Feedback
Eight Steps to Great Customer Experiences for Government Agencies – An Oracle Whitepaper
The Customer Service Playbook for Government
Customer Experience Toolkit
- Follow policies and requirements
- Digital Metrics Guidance and Best Practices
- Paperwork Reduction Act Fast Track Process
- Adopt survey best practices
- Designing a Better Customer Survey – video
- Navigating the Alphabet Soup of Survey Methodologies (PDF) – ClickTools
- Sample CX Question Database (Excel, 87 kb)
- Likert-Type Scale Response Anchors – recommended wording and rating scales for a variety of survey questions
- Sample surveys
- USAJOBS (Excel, 76 kb) – OPM