Connecting Autodesk Construction Cloud to your BI in 2026
Pull BIM elements, RFIs, submittals, and clash data out of Autodesk Construction Cloud into a real BI workflow. Three approaches, honest trade-offs, and what actually scales.
By The iDBQuery Team
Autodesk Construction Cloud (ACC) is the new home of BIM 360, ACC Build, and the rest of Autodesk's project-delivery stack. The data inside it — model elements, RFIs, submittals, daily logs, clashes, schedules, photo markups — is genuinely valuable to a BI team. Getting it out of ACC and into a queryable form, in 2026, is harder than it should be.
This post walks through the three real approaches in 2026, where each one falls over, and what we recommend by team size.
What ACC actually exposes
ACC's data lives behind two surfaces:
- Autodesk Platform Services (APS) — REST APIs covering: project metadata, BIM 360 Document Management files, ACC Build issues / RFIs / submittals, model elements via the Model Derivative service, and webhooks for change events.
- ACC Connector / ACC Insight — Autodesk's own ETL and reporting layer, licensed per seat, which surfaces a subset of the above in a Power BI / Tableau-friendly schema.
Both work. They're built for different people. APS is for engineers; Insight is for analysts who don't want to write code. The APS route gives you the most data and the most control. The Insight route is fastest to a dashboard if you accept its schema.
Approach 1: Roll your own with the APS REST APIs
The DIY path. You write a sync job that hits the relevant APS endpoints on a schedule, normalizes the responses into your warehouse, and points Power BI / Tableau / Looker at the warehouse.
What you build:
- Three-legged OAuth flow. Each user grants your app access to their hubs/projects. Refresh tokens expire after 14 days; you have to handle re-auth.
- Project enumeration. Walk hubs → projects → folders → items. Pagination at every level.
- BIM element extraction. Use the Model Derivative API to translate models into SVF2 plus a property database. The property DB is a SQLite file you have to parse to extract per-element properties (`element_id`, `category`, `family`, `type`, `level`, `material`, …). Tens of thousands of elements per file is normal.
- Issues / RFIs / submittals. ACC Build APIs. Roughly one HTTP call per project per entity type. Many are paginated at 100 per page.
- Webhooks. APS supports push notifications on file version changes. You set up a webhook callback and queue a re-sync per affected file.
- Per-file delta sync. When a file changes, fetch only that file's elements via `elementsByVersion(versionId)`. Re-pulling every element of every file on every change does not scale beyond ~5 small models.
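The enumeration step above can be sketched with a small pagination helper. This is a minimal sketch, assuming the APS Data Management endpoints (`GET /project/v1/hubs`, `GET /project/v1/hubs/{hub_id}/projects`) and their JSON:API-style `links.next` pagination; token handling, retries, and the folder/item levels are omitted.

```python
from typing import Callable, Iterator

def paginate(fetch: Callable[[str], dict], url: str) -> Iterator[dict]:
    """Yield every item in `data`, following links.next.href until absent."""
    while url:
        page = fetch(url)
        yield from page.get("data", [])
        url = page.get("links", {}).get("next", {}).get("href")

def walk_hubs_and_projects(fetch: Callable[[str], dict]) -> dict:
    """Enumerate hubs, then projects per hub. `fetch` wraps an
    authenticated GET that returns parsed JSON (auth omitted here)."""
    base = "https://developer.api.autodesk.com/project/v1"
    tree = {}
    for hub in paginate(fetch, f"{base}/hubs"):
        hub_id = hub["id"]
        tree[hub_id] = [p["id"]
                        for p in paginate(fetch, f"{base}/hubs/{hub_id}/projects")]
    return tree
```

The same `paginate` helper covers the folder and item levels, which use the same response envelope.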
This works. It costs a senior engineer roughly 3–6 weeks to get to "production-ready for one project," then 1–2 weeks per major addition (clash data, schedule data, daily logs). The maintenance burden is real — Autodesk ships breaking changes 1–2× per year.
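The webhook piece of the build reduces to a small handler that parses the event and queues a re-sync. A minimal sketch, with the payload field name (`resourceUrn`) assumed for illustration; check the delivered JSON against the APS Webhooks documentation before relying on it.

```python
import json
from collections import deque

# A worker drains this queue and re-fetches elements for each changed version.
resync_queue: deque = deque()

def handle_webhook(body: str) -> bool:
    """Parse an APS file-version-change event and queue a delta re-sync.
    Field name `resourceUrn` is an assumption, not a documented guarantee.
    Returns True if a re-sync was queued."""
    event = json.loads(body)
    version_urn = event.get("resourceUrn")  # assumed field name
    if not version_urn:
        return False
    if version_urn not in resync_queue:  # cheap de-dup for bursty saves
        resync_queue.append(version_urn)
    return True
```

Because webhooks can be delayed or dropped (see the gotchas below), the same queue should also be fed by a scheduled reconcile sweep.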
When this is the right approach: you're building a custom platform on top of construction data, and the BI is a thin layer over a much bigger data product.
Approach 2: Use ACC Insight + Power BI
Autodesk's own analytics product. You enable Insight on the project, it pre-computes a schema, and you connect Power BI directly via the connector.
Pros:
- Working dashboards in days, not weeks
- Autodesk maintains the schema across breaking ACC changes
- Familiar Power BI authoring story for BI analysts
Cons:
- Per-seat licensing on top of the ACC seats you already pay for
- Schema is closed — what Insight exposes is what you get; if you want `element.material_id` and Insight doesn't surface it, you're stuck
- Cross-source joins (e.g. ACC + Maconomy + your weather API) are not first-class
- Power BI authoring still requires a person who knows DAX and the report builder
- BIM model viewing is separate; you switch tools to look at the 3D context for a question
When this is the right approach: large general contractors with a dedicated BI team that already lives in Power BI and is happy to stay there.
Approach 3: Use a multi-source AI assistant with native ACC support
The path that's emerged in 2026 alongside the rise of AI-assisted BI. Instead of building the sync layer yourself or paying for Insight + Power BI, you use a tool that already has the APS connector built and exposes everything as a chattable source.
iDBQuery's Autodesk APS integration handles the full pipeline: OAuth setup, hub/project enumeration, model element extraction, RFIs/submittals, clash data, webhook-driven delta sync, and per-file version tracking. The construction-vertical of the product (sub-brand: SiteMind) ships with BIM-aware chat that knows about element categories, levels, families, and clash relationships natively.
What this looks like in practice. After connecting your ACC hub:
"Show all RFIs older than 14 days, grouped by trade and project"
The AI generates the query against the ACC RFI data, joins it to your project list, and returns a stacked bar chart. No sync code, no Power BI modeling, no DAX. In the same chat you can then ask:
"How many openings in the Tower-3 model are wider than 1.2 m and on Level 5?"
This hits the BIM element data — same chat, same conversation context. Element queries that would have required a custom Forge SDK script in 2022 are now one prompt.
Pros:
- No sync engineering. Connect the hub, ask questions.
- Native cross-source joins: ACC + ERP (Maconomy, Procore, Aconex) + GIS + contract documents in one chat
- AI-built dashboards from a chat prompt — no Power BI modeling
- BIM viewer embedded in the chat, so the 3D context is one click away
- Webhooks + per-file-version delta sync handled automatically
Cons:
- You're trusting a tool to maintain the connector — verify the vendor has shipped breaking-change updates within 2 weeks of past Autodesk releases
- If you're already invested in Power BI as the company-wide BI tool, this is a parallel surface, not a Power BI replacement
- Custom Enterprise pricing for production AEC deployments — free tier handles a single hub for evaluation
When this is the right approach: AEC firms that want construction analytics without staffing a data engineering team, especially when ACC is one of several systems being analyzed (the multi-source story is where this approach decisively wins).
Specific gotchas, regardless of approach
A few things bite every team that touches APS for the first time:
- Initial model translation is slow. A 200 MB Revit file takes 5–20 minutes to translate to SVF2. Schedule the first sync overnight; subsequent syncs are incremental.
- Property database extraction is undocumented at the edges. The `objects_attr`, `objects_val`, `objects_eav` SQLite tables are how Autodesk normalizes properties — joining them correctly is non-obvious. Use a tested library or a managed connector.
- Coordinates are in BIM space, not GIS space. If you want to overlay model data on a site map, you need the project's geo-reference matrix (often sketchily set in the model). Always validate against a known landmark.
- RFI status taxonomies vary by project. "Open / Answered / Closed" is the default; some projects use 8-state custom workflows. Don't hard-code statuses in your SQL.
- Webhooks fire late. APS webhooks have a typical 30–60 second delay and occasional drops. Combine push + a daily reconcile sweep.
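To make the property-database EAV shape concrete, here is a toy sqlite3 session. The table and column names are simplified for illustration — real property databases prefix the tables (e.g. with an underscore) and carry more columns, so treat this as a sketch of the join pattern, not the exact schema.

```python
import sqlite3

# Toy reproduction of the entity-attribute-value layout used by the
# Model Derivative property database (names simplified; see lead-in).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE objects_attr (id INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE objects_val  (id INTEGER PRIMARY KEY, value TEXT);
CREATE TABLE objects_eav  (entity_id INTEGER, attribute_id INTEGER, value_id INTEGER);

INSERT INTO objects_attr VALUES (1, 'Level',    '__internalref__'),
                                (2, 'Material', 'Materials');
INSERT INTO objects_val  VALUES (10, 'Level 5'), (11, 'Concrete');
INSERT INTO objects_eav  VALUES (42, 1, 10), (42, 2, 11);
""")

# The join that turns EAV triples back into per-element property rows:
rows = con.execute("""
    SELECT eav.entity_id, attr.name, val.value
    FROM objects_eav  AS eav
    JOIN objects_attr AS attr ON attr.id = eav.attribute_id
    JOIN objects_val  AS val  ON val.id  = eav.value_id
    ORDER BY attr.name
""").fetchall()
# Each element (entity_id 42 here) fans out to one row per property.
```

The subtlety in real files is value typing and attribute flags, which is why a tested library or managed connector is the safer route.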
When to pick what
| Team profile | Recommended approach |
|---|---|
| Solo developer building a custom AEC product | Roll your own (Approach 1) |
| 50-person GC with dedicated Power BI analyst | ACC Insight + Power BI (Approach 2) |
| 50-person GC without a dedicated BI team | Multi-source AI assistant (Approach 3) |
| Architecture or engineering firm | Multi-source AI assistant (Approach 3) — usually you also have ERP + GIS data to mix in |
| Owner-developer running multiple projects | Multi-source AI assistant (Approach 3) — cross-project portfolio analysis is what you actually want |
Conclusion
In 2026, getting Autodesk Construction Cloud data into a real BI workflow is no longer a 6-week engineering project — unless you genuinely need a custom data platform, in which case build it yourself with APS. For everyone else, either Autodesk's own Insight + Power BI (if you're already a Power BI shop) or a multi-source AI assistant with native ACC support (if you want construction analytics without staffing a data team).
If you want to try the third approach: the construction vertical of iDBQuery is built specifically for AEC firms, with APS / Maconomy / Procore / Aconex / PostGIS all as first-class sources in one chat. Free tier handles a single hub for evaluation; production deployments are Custom Enterprise.
FAQ
Do I need to be an Autodesk Forge developer to use APS? For Approach 1, yes — you need to register an APS app, handle OAuth, and code against the REST APIs. For Approaches 2 and 3, the connector handles all of that; you click "Connect Autodesk" and grant access.
Does APS support BIM 360 (the old name) as well as ACC? Yes — APS is the unified platform for both. Some endpoints have separate paths for BIM 360 vs ACC projects; managed connectors abstract this.
What about Autodesk Vault or Fusion 360? Different APIs. APS covers Vault and Fusion 360 too, but most BI use cases are ACC-shaped. Vault is more often integrated with PLM systems than BI tools.
Can I push data BACK into ACC (e.g., create RFIs from a chart)? Yes — APS has write endpoints. Use this carefully; every write should have a per-action audit log. iDBQuery's ERP write-back follows the same opt-in-with-audit-log pattern for ACC writes.
What's the latency from "BIM file updated in ACC" to "data available in my BI"? Push (webhook) latency: 30–60 seconds typical. Translation latency for a changed model: 1–10 minutes depending on file size. End-to-end for "saved Revit file → queryable element data": 2–15 minutes.
Does it work with IFC files? Yes — APS supports IFC translation alongside Revit RVT. Element properties are exposed the same way. Coordinate systems are sometimes more reliably set in IFC than in Revit.
How much does APS cost? APS itself has a free tier covering 1,000 cloud credits / month — enough for a small project. Production usage runs roughly $0.001–$0.10 per element extraction depending on file complexity. Translation, viewer, and Model Derivative each have their own pricing.