Demir at Geotab
Timeline
4 Months, September – December 2025
Overview
Systems Engineering Intern on the Enterprise Architecture team within Business Technology Operations. I restructured the application inventory, built data automations connecting the Dayforce API to Ardoq, deployed AI tooling for the team, and designed a phased governance roadmap to improve adoption across the organization.
Tools
Ardoq, Workato, Dayforce API, Google BigQuery, Jira, ROVO AI, Gemini

Background

Geotab connects over 4 million vehicles worldwide. Behind the fleet telematics products that customers see is a sprawling internal technology landscape: hundreds of applications, integrations, data pipelines, and vendor relationships that all need to be tracked, governed, and kept current. That tracking is the job of Enterprise Architecture.

The EA team uses Ardoq as the central system of record. When Ardoq is accurate, it powers planning, risk assessment, vendor decisions, security reviews, and intake approvals. When it's not, teams fall back to spreadsheets and direct messages, and the EA team absorbs the manual overhead.

I joined as a Systems Engineering Intern reporting to Dylan Finley, the Manager of Enterprise Architecture. My mandate was broad: improve data quality, restructure the application inventory, extend automation, and figure out why adoption across internal teams had stalled. What I found was more interesting than a typical data cleanup project.

Inventory Restructuring

Reclassify thousands of entries across Application Groups, Technology Products, and Application Modules with proper parent-child relationships.

Data Automation

Build Workato recipes connecting the Dayforce API to Ardoq, syncing organizational data and reducing manual drift.

AI Tooling

Deploy ROVO AI dashboards for request pattern analysis and set up AI assistants for team members using Gemini.

Governance Roadmap

Design a phased improvement plan grounded in adoption research to drive sustained accuracy and engagement.

Discovery

The first few weeks were about understanding the state of things. I had access to Ardoq, BigQuery, Jira, and the weekly EA meetings where the team reviewed intake requests and discussed outstanding issues. The picture that emerged was clear: the system of record had drifted away from reality.

The Inventory

I started with the application inventory. Thousands of entries across the Ardoq workspace. Many lacked owners. Lifecycle states hadn't been updated in months. Organizational units didn't match the source of truth in Google BigQuery. Applications had been renamed internally but their old entries were never retired. Others were entered by different contributors under slightly different names, creating duplicates that fragmented search results.

The data wasn't wrong everywhere. Where Workato automations were running, fields like names, titles, and manager information stayed accurate because they synced weekly. But the fields that depended on manual input (ownership, lifecycle states, naming) had drifted significantly. The duplicates weren't just cosmetic. Each one forced reviewers to spend time figuring out which entry was canonical, slowing down intake reviews and eroding trust in the system.

Why It Matters

EA tools depend on strong governance, clear ownership, and automated synchronization to remain accurate. Without these conditions, data drift occurs and adoption declines. When inventories fall out of sync with how an organization actually operates, they lose strategic value and teams stop relying on them.

I mapped the challenges I was seeing against established adoption frameworks. Inconsistent data lowered perceived usefulness, and manual updates raised perceived effort. Both discourage adoption. Automation didn't cover enough fields, ownership rules weren't enforced, and no workflow required people to keep entries current. Awareness was present, but reinforcement was missing. People knew Ardoq existed, but there were no consequences for letting entries go stale.

Stale Data (manual fields drift)
→ Low Trust (teams doubt accuracy)
→ Tool Bypass (ask people instead)
→ More Drift (fewer updates filed)
→ back to Stale Data

The rational response from contributors was predictable: bypass the tool and ask someone directly. That behavior pattern was the clearest signal that adoption had a structural problem, not a training problem.

Restructuring

The inventory wasn't just dirty. It was structurally flat. Everything was listed at the same level with no grouping, no parent-child relationships, and no distinction between business applications and developer tooling. My first major project was giving the inventory a proper architecture.

Before

├─ Adobe Photoshop
├─ Adobe Illustrator
├─ Adobe InDesign
├─ Salesforce
├─ Salesforce CPQ
├─ GitHub Actions (misplaced)
├─ Terraform (misplaced)
├─ Figma Plugin A (misplaced)
├─ figma-plugin-a (misplaced duplicate)
... flat list continues

After

Application Groups
├─ Adobe Creative Cloud
│  ├─ Photoshop
│  └─ Illustrator
└─ Salesforce
   └─ Salesforce CPQ
Technology Products
├─ GitHub Actions
└─ Terraform
Modules
└─ Figma - Plugin A

Application Groups

Many entries in the inventory were individual products that belonged to larger suites, but they were scattered as standalone records. Adobe Creative Cloud, Atlassian, Salesforce, Google Workspace: each had their child products listed independently with no connection to the parent.

I started with a mapping in Google Sheets, identifying every suite and its child products across the full inventory. Then I restructured these in Ardoq under single parent entries with proper component relationships. Reviewers could now assess an entire suite at a glance during licensing or vendor reviews instead of piecing together individual components.

Technology Products

A more subtle structural problem was that developer tools were mixed in with business applications. GitHub libraries, internal frameworks, build tools: these are used by dev teams to build applications, but they aren't “business applications” in the same sense as Salesforce or Figma. Treating them the same distorted the portfolio and made it harder to get an accurate picture of the organization's software spend.

I worked on separating these into a distinct Technology Products workspace, creating grouping folders and defining what qualified as a technology product versus a business application. This was fairly open-ended and required judgment calls about categorization, but the result was a cleaner separation that gave both the EA team and engineering leadership a more accurate view of what they were actually managing.

Modules & Naming

The Application Module concept existed in Ardoq, but it wasn't being used consistently. Plugins and extensions were entered as top-level applications instead of modules under their parent. I identified all items that should be reclassified as Application Modules, converted them in Ardoq, assigned parent records, and made sure all fields from the Application workspace applied to modules as well.

Naming was the other half of this. Without a standard, the same plugin could appear under three different names. I introduced a convention: Application Name - Plugin Name. Every module entry now started with its parent application, making the hierarchy immediately visible in search results and alphabetical listings. A small change that cascaded into fewer duplicates, better search, and faster reviews.
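The convention pays off because duplicates become mechanically detectable: once separators and casing are normalized away, variant spellings of the same module collapse to one key. A minimal sketch (the helper names and example entries are illustrative, not Ardoq's actual API or data):

```python
import re

def normalize(name: str) -> str:
    """Lowercase and collapse separators so that 'Figma Plugin A'
    and 'figma-plugin-a' compare equal."""
    return re.sub(r"[^a-z0-9]+", " ", name.lower()).strip()

def module_name(parent: str, plugin: str) -> str:
    """Apply the 'Application Name - Plugin Name' convention."""
    return f"{parent} - {plugin}"

def find_duplicates(entries: list[str]) -> dict[str, list[str]]:
    """Group entries whose normalized forms collide."""
    groups: dict[str, list[str]] = {}
    for entry in entries:
        groups.setdefault(normalize(entry), []).append(entry)
    return {key: names for key, names in groups.items() if len(names) > 1}

# Two spellings of the same plugin collapse into one duplicate group.
dupes = find_duplicates(["Figma Plugin A", "figma-plugin-a", "Terraform"])
```

In practice this kind of check ran as a review pass over exported inventory data rather than inside Ardoq itself.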

Automation

Restructuring the inventory solved the structural problems, but it didn't address the ongoing data drift. For that, I turned to automation.

Dayforce Sync

The existing Workato recipes synced organizational data from Google BigQuery into Ardoq on a weekly schedule. I extended this by pulling organizational updates from Dayforce, Geotab's HR system, through its API. This involved requesting access to the Dayforce Organization API through People Operations, then building Workato recipes that made REST calls, parsed the JSON responses for employee data, and pushed the results into Ardoq, following the patterns Dylan had established.

The sync kept names, titles, emails, managers, and organizational units aligned with the HR source of truth automatically. A separate decommission workflow handled archived applications by updating their lifecycle states. Together, these automations meant that the fields they covered stayed accurate without anyone having to remember to update them.
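The recipes themselves are low-code Workato workflows, but the core sync logic can be sketched in a few lines: diff the HR source of truth against what Ardoq currently holds and emit only the fields that changed. The field names and record shapes below are illustrative, not the actual Dayforce or Ardoq schema:

```python
from typing import Any

# Fields the sync keeps aligned (illustrative names).
SYNCED_FIELDS = ["name", "title", "email", "manager", "org_unit"]

def plan_sync(hr_records: list[dict[str, Any]],
              ardoq_records: dict[str, dict[str, Any]]) -> dict[str, dict[str, Any]]:
    """Compare HR records against Ardoq and return the minimal set of
    field updates, keyed by employee id."""
    updates: dict[str, dict[str, Any]] = {}
    for rec in hr_records:
        emp_id = rec["employee_id"]
        current = ardoq_records.get(emp_id, {})
        changed = {f: rec[f] for f in SYNCED_FIELDS
                   if current.get(f) != rec.get(f)}
        if changed:
            updates[emp_id] = changed
    return updates

# Example: only the out-of-date title is flagged for update.
hr = [{"employee_id": "e1", "name": "Ada Example", "title": "Analyst",
       "email": "ada@example.com", "manager": "m1", "org_unit": "EA"}]
existing = {"e1": {"name": "Ada Example", "title": "Intern",
                   "email": "ada@example.com", "manager": "m1", "org_unit": "EA"}}
pending = plan_sync(hr, existing)
```

Computing a minimal diff before writing keeps the weekly run idempotent: if nothing changed in HR, nothing is written to Ardoq.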

Dayforce API

HR & org chart data

Google BigQuery

Employee source of truth

Workato

REST API + JSON; Org Sync and Decommission recipes

Ardoq

System of record

Applications

Groups, modules

Tech Products

Dev tooling

Coverage Gaps

But automation only covered part of the picture. Active lifecycle states, ownership assignments, and naming conventions still depended entirely on manual input. The pattern was consistent: where automation existed, data quality was high. Where it didn't, drift accumulated.

Automated fields

Name (BigQuery)
Title (BigQuery)
Email (BigQuery)
Manager (Dayforce)
Org Unit (Dayforce)
Lifecycle: Archived (Workato)

Manual fields

Ownership (high drift)
Lifecycle: Active (high drift)
Naming (some drift)

Partial automation actually increases inconsistency because the manual gaps become failure points. People assumed Ardoq was accurate because some fields were always right, which made the inaccurate fields more dangerous. The lesson wasn't that manual processes are bad; it was that partial automation can be worse than no automation, because it creates a false sense of completeness.

AI & Tooling

Beyond the inventory and automation work, I helped the team adopt AI tooling to reduce manual effort and surface insights that weren't visible before.

ROVO AI

Dashboards for the software request service desk. Surfaced request patterns and duplicate clustering across Jira tickets.

Gemini

Workflow assistant for Daiane, helping with day-to-day tasks, process documentation, and data entry guidance.

Onboarding Agent

AI agent for new contributors, covering naming conventions, Ardoq navigation, and common data quality mistakes.

ROVO Dashboards

I built dashboards using Atlassian ROVO AI to give the team visibility into the software request service desk. Instead of manually scanning Jira tickets, the team could now see request patterns at a glance: which tool categories had the most submissions, where duplicate requests were clustering, and how change requests flowed through the review process.

The CR request analysis was particularly revealing. Certain categories (screen capture tools, PDF editors, design plugins, browser extensions) saw repeated submissions from different teams, often for tools that already existed in the portfolio or had approved alternatives. This data directly informed the recommendation to integrate Ardoq checks into the software intake workflow.

AI Assistants

I also worked on deploying AI assistants for individual team members. I set up a Gemini-based assistant for Daiane to help with her day-to-day workflows, and connected with Iryna to explore building an AI onboarding agent that could help new contributors understand how to work with Ardoq, follow naming conventions, and avoid common data quality mistakes.

These were smaller initiatives, but they reflected a broader pattern: the EA team's challenges weren't just about data cleanup. They were about reducing the cognitive load on a small team managing a system that the entire organization depended on. Every hour saved through automation or AI was an hour that could go toward governance and strategic work instead of manual maintenance.

Software Intake

Jira Request (team submits software request)
→ No Ardoq check (reviewer manually searches)
→ Approve (new Ardoq entry; duplicate? depends on the reviewer)
or Reject (no record kept)

The ROVO analysis surfaced the most impactful structural finding of the term: the software request workflow didn't check Ardoq at all. Requests came through Jira, and reviewers manually searched Ardoq for existing tools. This created delays and inconsistency (different reviewers found different things), and it left requesters with no visibility into what was already approved.

This was a facilitating conditions failure. Ardoq wasn't woven into the workflow that mattered most. Architecture tools gain adoption when they're tied directly to strategic decision-making. Intake is the single most important decision point for application governance. Requests that could be validated automatically were instead being reviewed manually, and the tool that should have been central to that decision was only consulted as an afterthought.

The Roadmap

The fixes I implemented during the term (restructuring, naming conventions, automation, AI tooling) were immediate improvements, but they addressed symptoms. The deeper changes required organizational commitment beyond a single co-op term. I designed a four-phase roadmap grounded in the adoption research and presented it to the EA team.

01

Stabilize Data Quality

Expand Workato automation to cover lifecycle states and ownership. Automate transitions based on defined triggers. Source ownership from the HR dataset.

02

Clarify Ownership

Assign explicit owners and managers to every Ardoq entry. Document expectations. Make individuals accountable for updates.

03

Improve Onboarding

Create task-based guides with real Geotab examples. Cover adding applications, updating lifecycle states, following naming conventions, and avoiding duplicates.

04

Integrate Intake

Build an automated Ardoq check into the software request workflow. Validate application names and detect duplicates before reviewers see the request.

Each phase builds on the previous one. Stable data makes ownership meaningful. Clear ownership makes onboarding possible. Strong onboarding makes intake integration effective. Skipping phases would recreate the same problems in a different form.
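The Phase 04 intake check can be sketched simply: before a reviewer ever sees a ticket, compare the requested tool name against the existing portfolio and attach any likely matches. This is a minimal sketch using fuzzy string matching; the function names, threshold, and portfolio entries are hypothetical, not the actual integration:

```python
import difflib
import re

def _norm(name: str) -> str:
    """Lowercase and collapse separators for comparison."""
    return re.sub(r"[^a-z0-9]+", " ", name.lower()).strip()

def intake_check(requested: str, portfolio: list[str],
                 cutoff: float = 0.6) -> list[tuple[str, float]]:
    """Return portfolio entries that likely match a new request,
    scored so reviewers see existing or approved alternatives up front."""
    req = _norm(requested)
    scored = []
    for entry in portfolio:
        score = difflib.SequenceMatcher(None, req, _norm(entry)).ratio()
        # Catch a portfolio name embedded in a longer request,
        # e.g. "snagit" inside "snagit screen capture".
        if _norm(entry) in req:
            score = max(score, 0.9)
        if score >= cutoff:
            scored.append((entry, round(score, 2)))
    return sorted(scored, key=lambda pair: -pair[1])

# A screen-capture request is flagged against an existing approved tool.
portfolio = ["Snagit", "Adobe Acrobat Pro", "Figma", "Loom"]
matches = intake_check("SnagIt Screen Capture", portfolio)
```

Even a coarse check like this would have caught the repeated screen-capture and PDF-editor requests the ROVO dashboards surfaced, before they reached a reviewer.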

Takeaways

Structure before data

The inventory needed its own architecture before individual entries could be cleaned. Without it, cleanup was just rearranging noise.

Partial automation misleads

When some fields are always right and others are stale, people can't tell the difference. They trust everything or nothing.

Adoption follows integration

The biggest unlock was tying Ardoq into the intake workflow where decisions actually happen. Tools in the critical path get used.

Leverage over process

ROVO dashboards, Gemini assistants, and automated sync recipes let a small team operate at the scale the organization demanded.