
GIS CAPABILITY MATURITY MODEL PEER EXCHANGE

Nashville, Tennessee
September 20-21, 2016

Host agency:
Tennessee Department of Transportation


Participating peer agencies:
Arizona Department of Transportation
Iowa Department of Transportation
Michigan Department of Transportation
North Carolina Department of Transportation
Ohio Department of Transportation
Oregon Department of Transportation
Tennessee Department of Transportation


ACKNOWLEDGMENTS

The U.S. Department of Transportation John A. Volpe National Transportation Systems Center (Volpe Center) in Cambridge, Massachusetts, prepared this report for the Federal Highway Administration’s (FHWA) Office of Planning. The Volpe Center project team wishes to thank the participants in the peer exchange, which are listed in Appendix A, for providing their experiences, insights, and editorial review. The time they kindly provided was vital to preparing the exchange and reviewing this final report.



INTRODUCTION

Purpose

This report provides highlights from a peer exchange held in Nashville, TN, on September 20-21, 2016. The exchange was held as part of the Federal Highway Administration’s (FHWA) Geographic Information Systems (GIS) in Transportation program1 and was hosted by the Tennessee Department of Transportation (TDOT). The purpose of the exchange was to highlight the role that organizational assessments can play in developing a comprehensive GIS strategy, by sharing experiences between State DOTs that have undergone organizational assessments and State DOTs that have limited experience with, or have not undergone, an assessment.

Background

A Capability Maturity Model (CMM) is defined by the Urban and Regional Information Systems Association (URISA) as “a tool to assess an organization’s ability to accomplish a defined task or set of tasks.”2 A CMM usually includes a numerical ranking system used for comparison and analysis of the organization. The ranking shows where the organization falls on a “maturity scale” for the defined task(s), where maturity refers to the level at which the task can be performed.

A GIS-specific Capability Maturity Model was developed by URISA in 2009 and was first implemented as a self-assessment tool by Washington State GIS operators.3 The results from this implementation were discussed at URISA’s Annual Conference in 2010, and the CMM was subsequently adopted as an official URISA initiative. In 2011, the first comprehensive pilot of the GIS CMM was carried out by Washington and Oregon State GIS managers, alongside the development of a straw-man draft called the URISA Geospatial Management Competency Model (GMCM). This straw-man draft process was critical to making the connection between professional GIS management practices and the management of an enterprise GIS operation.

Through the use of CMMs, GIS managers and the executives who oversee the deployment of GIS resources can have meaningful dialogues regarding the structure and characteristics of a mature, well-managed enterprise GIS. While many organizations can point to successes in implementing a GIS or geospatial tool for specific projects, completing such projects does not necessarily reflect the maturity of the organization’s GIS operations. GIS projects are often performed on an ad hoc basis to address immediate needs or requests. This practice treats GIS technology as a collection of one-off tools used only when needed, which limits its effectiveness in addressing overall program needs. Additionally, once a tool or project is completed, organizations often cannot continue to dedicate staff time or resources to maintenance and fine-tuning, reducing the tool or project’s benefits over time.

Undertaking a CMM evaluation is a time-intensive process, and a full version might not be appropriate for organizations with limited resources or time. Such organizations can instead perform a less intensive Organizational Assessment, a scaled-down, compressed alternative to a full CMM.

Format

FHWA’s Office of Planning, Realty and Environment (HEP) sponsored the peer exchange with support from FHWA’s Office of Highway Policy Information (OHPI). TDOT hosted the peer exchange in Nashville, TN. Participants included staff from TDOT and representatives from the Ohio Department of Transportation (Ohio DOT), North Carolina Department of Transportation (NCDOT), Iowa Department of Transportation (IDOT), Arizona Department of Transportation (ADOT), Michigan Department of Transportation (MDOT), and Oregon Department of Transportation (Oregon DOT). Allen Ibaugh, a representative of URISA, was also present.

The Peer Exchange was held over the course of two days. FHWA began the exchange by presenting an overview of the FHWA GIS in Transportation program and a summary of GIS Capability Maturity Models. This was followed by a series of presentations and roundtable discussions that addressed pre-identified topics of interest to both FHWA and the peers. The exchange concluded with a discussion of next steps and final remarks from FHWA that summarized recurring themes. See Appendix A for the peer exchange agenda, including roundtable discussion topics.

Overview of Peer Examples

The examples presented in the peer exchange focused on the use of CMMs/Organizational Assessments and the attributes that support GIS maturity, including “selling” GIS to upper management, GIS staff organization, data governance policies, and the role of Information Technology (IT) within GIS operations. Table 1 (below) provides an overview of the examples highlighted during the peer exchange.

Table 1. Overview of Examples

Agency Name | Completed CMM or Organizational Assessment? | Overview of GIS Strategy and Organization
Arizona DOT | Yes | ADOT is focused on separating GIS from IT and uses contractors to perform a large share of its GIS work. A CMM revealed that ADOT has four primary challenges: limited skill sets and internal knowledge, insufficient training, data quality issues, and a lack of data collection standards. ADOT currently has a small but flexible staff that performs data documentation.
Iowa DOT | Yes | IDOT follows a centralized GIS model and has performed multiple CMMs. IDOT is conducting a geospatial inventory and a data governance initiative, and is working with an IT consultant on an e-vision project and an IDOT Open Data Portal. Part of IDOT’s goal is to continue educating all users about GIS technologies.
Michigan DOT | No | MDOT is interested in learning from other State DOTs. Currently, MDOT uses GIS tools and services on a project-by-project basis.
NCDOT | Yes | NCDOT has implemented a hybrid governance model that provides central coordination for NCDOT’s GIS platform, technology, and standards, while supporting individual business units according to their level of GIS need and expertise. NCDOT’s goal is to empower its customers as much as possible by making GIS ubiquitous within the enterprise, often without users realizing it.
Ohio DOT | Yes | Ohio DOT has performed multiple rounds of assessments starting in 2002. The State DOT’s GIS group has focused on defining the value of GIS to the agency and implementing structured management processes within the GIS office. Ohio DOT also recognized the importance of executive-level support through written policies, and uses marketing to spread the word about GIS uses across the agency and to the public.
Oregon DOT | No | Oregon DOT is interested in learning from other State DOTs about the CMM process. The State DOT does not have a direct State GIS mandate, but its work is critical in supporting and supplementing several state and federal reporting requirements. Oregon DOT wants to determine where the GIS Unit can best support the agency.
Tennessee DOT | No | TDOT is trying to optimize its organizational structure and system architecture, and is interested in learning more about CMMs.



PEER EXCHANGE DISCUSSION HIGHLIGHTS

The peer exchange participants had a lively and meaningful dialogue aimed at exploring ideas, offering solutions to each other’s GIS-related issues, and sharing resources for addressing them. Overall, peers discussed the CMM process, as well as a series of challenges related to organizational assessments and CMMs. Discussions focused on:

  1. Capability Maturity Models & Organizational Assessments;
  2. GIS Awareness in Management;
  3. Organizing and Building a GIS Organization within an Agency; and
  4. Data Management.

A. Capability Maturity Models and Organizational Assessments

The focus of the peer exchange was on the lessons from two State DOTs’ (Iowa and Ohio) past organizational assessments. IDOT’s and Ohio DOT’s presentations described their experiences with the assessments and the benefits they realized, and offered recommendations and guidance to other State DOTs interested in performing CMMs.

B. GIS Awareness in Management

A common problem facing the peers was “selling” the value of GIS technologies to upper management. This problem exists because GIS can mean something different to everyone, making it difficult to define clearly what GIS is within the context of the organization. This ambiguity results in upper management being unable to see the direct applications of GIS tools and practices for their organization, and consequently the direct benefits. Furthermore, executive-level and middle-level management can have misaligned goals. The message might be clear to upper management who are fluent in how GIS tools are applied and what they can produce, but unclear to, or not acted on effectively by, middle management, who may not be as knowledgeable about GIS applications. Either way, the problem lies in a lack of awareness or knowledge of GIS technologies and their relevant applications for the agency.

C. Building and Organizing GIS within an Agency

A challenge experienced by all peers was choosing the GIS organizational structure that best fits their agency. The peers discussed the different models of GIS organizational structure that have been tried or are being planned. The peers also pointed out that each organization has different staffing needs, resources, and technical knowledge; a one-size-fits-all organizational model is therefore not feasible. Rather, the organizational structure should be tailored to each agency and informed by findings from CMMs and organizational assessments.

Participants found that two general organizational models prevailed: a centralized GIS department, either stand-alone or housed within the IT organization, and a decentralized model in which GIS users are dispersed throughout all departments (including IT). The peers also discussed the “two sides” of GIS: System Data Management (system architecture and software and data support, essentially keeping GIS up and running) and Analysis & Planning (the business of using GIS software for projects).

Contractors are often a solution to staffing troubles. However, some State DOTs find that the contracting process either limits them to too short a contract timeline or makes long-term contractor acquisition far too resource-intensive, with no options in between these extremes. An ideal solution would be a limited-services option to pre-compete experienced GIS contractors.

D. Data Management

The lack of data management policies and systems was a common problem for the majority of peers. GIS tools enable data use by opening up and helping visualize multiple layers of data, so it behooves all agencies using GIS to maintain structured data architecture and practices. “Structured” data is data that is organized and actively maintained within an enterprise database system; “unstructured” data is all data that is not actively managed and organized. Multiple participants noted that unstructured data tends to waste significant resources (time, money, and effort). Unfortunately, unstructured data is more common than structured data: data ends up scattered throughout the agency (on desktops, locked within IT, or known only to certain users). The challenge for most State DOTs is therefore breaking down silos in data storage to arrive at consistent data standards.



Conclusion

The Peer Exchange yielded in-depth conversations about the strategies and actions that State DOTs can take to perform organizational assessments and improve their GIS operations. Additionally, the group discussed the possibility of a workshop at GIS-T and collaboration with URISA to create a CMM specifically for State DOTs.

The key takeaways from this Peer Exchange include:



APPENDIX A: PEER EXCHANGE AGENDA AND PARTICIPANTS

FHWA Peer Exchange: GIS Capability Maturity Models (CMMs) and Organizational Assessments

Peer Participants

Office | Name | Title | Email | Phone
ADOT | Jim Meyer | | JMeyer@azdot.gov | 602-712-8037
MDOT | Cory Johnson | | JohnsonC45@michigan.gov | 517-335-2931
Oregon DOT | Brett Juul | GIS Unit Manager | Brett.A.Juul@odot.state.or.us | 503-986-3156
Ohio DOT | Ian Kidner | GISP, GIS Systems Administrator, Office of Technical Services | Ian.Kidner@dot.state.oh.us | 614-466-2594
NCDOT | John Farley | Manager, GIS Unit | JCFarley@ncdot.gov | 919-608-6570
IDOT | Eric Abrams | | Eric.Abrams@dot.iowa.gov | 515-239-1949
TDOT | Kim McDonough | | Kim.McDonough@state.tn.us | 615-741-4037
URISA | Allen Ibaugh | Past-President, URISA; CEO, Data Transfer Solutions | AIbaugh@dtsgis.com | 407-382-5222

US DOT Participants
Office | Name | Title | Email | Phone
FHWA | Mark Sarmiento | Office of Planning (HEPP) | Mark.Sarmiento@dot.gov | 202-366-4828
Volpe Center | Michael Green | | Michael.Green@dot.gov | 617-494-2553
Volpe Center | Anthony Lucivero | | Anthony.Lucivero@dot.gov | 617-494-2810

Peer Agenda

Tuesday, September 20
8:30 – 8:45 Welcome and Introductions – FHWA
8:45 – 9:15 Overview of USDOT GIS and Organizational Assessment Initiatives – FHWA
9:15 – 10:00 What do State DOTs new to organizational assessments want to learn?
  • Michigan DOT
  • Oregon DOT
  • Tennessee DOT
10:15 – 11:00 Demonstration/Presentation 1
  • Arizona DOT
11:00 – 11:45 Demonstration/Presentation 2
  • Iowa DOT
1:15 – 2:00 Demonstration/Presentation 3
  • North Carolina DOT
2:00 – 2:45 Demonstration/Presentation 4
  • Ohio DOT
3:00 – 4:00 Roundtable 1: Staffing Needs and Funding Challenges – All Participants
  • What staffing and funding challenges have you experienced?
  • How have you overcome or addressed them?
  • How have mandates affected your staffing challenges?
  • Is your GIS staffing centralized or decentralized? How does that affect your organization’s performance?
4:00 – 4:15 Day 1 Key Points/Wrap-Up – FHWA
Wednesday, September 21
8:00 – 8:15 Day 1 Re-cap – FHWA
8:15 – 9:00 Demonstration/Presentation 5
  • Allen Ibaugh
9:00 – 10:00 Roundtable 2: Capability Maturity Models – All Participants
  • How does an agency develop and implement a capability maturity model?
  • What are the benefits?
  • What are the barriers to implementation and how can an agency overcome them?
10:15 – 11:00 Roundtable 3: How to Conduct an Assessment – All Participants
  • How can agencies be successful at implementing the recommendations that come from organizational assessments?
  • How do you get buy-in from leadership?
  • What challenges have you faced and how have you overcome them?
11:00 – 11:45 Roundtable 4: Turning Assessments into Actions – All Participants
  • How can agencies be successful at implementing the recommendations that come from organizational assessments?
  • What does success look like?
11:45 - Noon Day 2 Key Points/Wrap-Up – FHWA



FOOTNOTES

1 Through technical support, resources, and capacity building opportunities, the FHWA GIS in Transportation program aims to assist transportation agencies to more effectively use GIS and geospatial applications. Additional information is available at https://www.gis.fhwa.dot.gov
2 URISA, GIS Capability Maturity Model Guide, September 2013.
3 URISA, GIS Capability Maturity Model Guide, September 2013.