Your customers already live inside QuickBooks, Xero, NetSuite, or Sage Intacct. They do not want to export CSVs. They do not want to re-key invoices. They want your product to read and write directly to their general ledger, and they want it to feel invisible.
This guide is the code-first reference for building accounting API integration into your product. Not the product evaluation ("which platform should we support?"), not the architecture overview ("what is a unified API?"), but the implementation details: how to authenticate against each provider, how to normalize their data models into a single schema, how to sync reliably, and how to handle the errors that will absolutely happen in production. Examples are in Python and Node.
If you're earlier in the decision process, Apideck's accounting software integration guide covers the architectural trade-offs, and the build vs. buy analysis breaks down the cost math.
How does accounting integration work differently for vertical vs. horizontal SaaS?
Both horizontal and vertical SaaS products need accounting integrations. The difference is depth.
Horizontal products (CRM, project management, billing) typically push data in one direction. A CRM creates an invoice in QuickBooks when a deal closes. A subscription billing platform syncs monthly charges. The data model is flat, and the mapping is straightforward.
Vertical products go further. A construction management platform syncs job cost data to the customer's GL with specific cost codes. A veterinary practice management system pushes invoices with line items mapped to revenue accounts by service type. The integration touches more objects, runs bidirectionally, and requires industry-specific GL mapping that horizontal products rarely deal with.
For horizontal products, accounting integration is a feature that improves retention. For vertical products, it's a procurement checkbox. Enterprise buyers run evaluation checklists, and "Connects to NetSuite" appears as a hard requirement. Without it, you don't make the shortlist.
What does authentication look like in code?
Every accounting API uses some variant of OAuth or token-based auth. The concepts are similar, but the implementations differ enough that you'll end up writing separate auth handling for each provider.
QuickBooks Online: OAuth 2.0 with rotating refresh tokens
import requests
from datetime import datetime, timedelta

class QBOAuth:
    TOKEN_URL = 'https://oauth.platform.intuit.com/oauth2/v1/tokens/bearer'

    def __init__(self, client_id, client_secret, realm_id):
        self.client_id = client_id
        self.client_secret = client_secret
        self.realm_id = realm_id

    def refresh_access_token(self, refresh_token):
        """
        Access tokens expire in 60 minutes.
        Refresh tokens are valid for up to 5 years (changed from
        the previous 100-day inactivity window in November 2025),
        but may rotate on every exchange. Always persist the new
        refresh_token from each response.
        """
        response = requests.post(self.TOKEN_URL, data={
            'grant_type': 'refresh_token',
            'refresh_token': refresh_token,
        }, auth=(self.client_id, self.client_secret))
        response.raise_for_status()  # surface failed refreshes instead of a KeyError below
        tokens = response.json()
        # CRITICAL: persist the new refresh token immediately.
        # The old one may already be invalidated.
        return {
            'access_token': tokens['access_token'],
            'refresh_token': tokens['refresh_token'],
            'expires_at': datetime.utcnow() + timedelta(seconds=tokens['expires_in']),
        }
Key details: access tokens last 60 minutes. Refresh tokens may rotate every 24-26 hours, so your system must persist the latest refresh token after every exchange. If a user disconnects your app from their QuickBooks company, all tokens are invalidated immediately regardless of expiry.
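To make that discipline concrete, here's a minimal sketch of refresh-and-persist, assuming a hypothetical token_store with get, save, and mark_needs_reauth methods wrapping whatever storage you use:

def refresh_and_persist(auth: QBOAuth, token_store, connection_id: str):
    current = token_store.get(connection_id)  # {'refresh_token': ..., 'expires_at': ...}
    try:
        tokens = auth.refresh_access_token(current['refresh_token'])
    except requests.HTTPError:
        # A 400 "invalid_grant" here usually means the user disconnected the
        # app or the refresh token is stale: flag the connection for re-auth.
        token_store.mark_needs_reauth(connection_id)
        raise
    # Save before anything uses the new access token, so a crash never leaves
    # you holding only the (possibly already-invalidated) old refresh token.
    token_store.save(connection_id, tokens)
    return tokens['access_token']

The ordering is the point: write the rotated refresh token to durable storage before you act on the new access token.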
Xero: OAuth 2.0 with tenant discovery
import requests

class XeroAuth:
    TOKEN_URL = 'https://identity.xero.com/connect/token'
    CONNECTIONS_URL = 'https://api.xero.com/connections'

    def refresh_access_token(self, refresh_token, client_id, client_secret):
        """
        Access tokens expire in 30 minutes (shorter than QBO's 60).
        After auth, you must call /connections to discover which
        tenants the user granted access to. Each tenant requires
        its own xero-tenant-id header on every API call.
        """
        response = requests.post(self.TOKEN_URL, data={
            'grant_type': 'refresh_token',
            'refresh_token': refresh_token,
            'client_id': client_id,
            'client_secret': client_secret,
        })
        response.raise_for_status()
        tokens = response.json()
        return tokens

    def get_tenants(self, access_token):
        """A single user may have access to multiple Xero orgs."""
        response = requests.get(self.CONNECTIONS_URL, headers={
            'Authorization': f'Bearer {access_token}',
        })
        return response.json()  # List of {tenantId, tenantName, ...}
Xero's rate limits are aggressive: 60 calls per minute per tenant, 5,000 per day per tenant, and a 10,000-per-minute app-wide cap. If you serve accounting firms with multiple Xero organizations, rate limit management becomes an architectural concern, not an afterthought.
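One way to stay under the per-tenant minute limit proactively, rather than only reacting to 429s, is a small limiter keyed by tenant. This is a sketch only: it assumes an asyncio codebase and ignores the daily and app-wide caps, which need their own accounting.

import asyncio
import time
from collections import defaultdict, deque

class TenantRateLimiter:
    """At most `limit` calls per `window` seconds, tracked separately per tenant."""

    def __init__(self, limit: int = 60, window: float = 60.0):
        self.limit = limit
        self.window = window
        self._calls = defaultdict(deque)          # tenant_id -> timestamps of recent calls
        self._locks = defaultdict(asyncio.Lock)   # queue waiters per tenant, not globally

    async def acquire(self, tenant_id: str):
        async with self._locks[tenant_id]:
            calls = self._calls[tenant_id]
            now = time.monotonic()
            while calls and now - calls[0] >= self.window:
                calls.popleft()                   # drop calls that have left the window
            if len(calls) >= self.limit:
                # Sleep until the oldest call ages out; other coroutines for
                # this tenant queue behind the lock, other tenants are unaffected.
                await asyncio.sleep(self.window - (now - calls[0]))
                calls.popleft()
            calls.append(time.monotonic())

# Usage: one limiter per provider, awaited before every call for that tenant.
xero_limiter = TenantRateLimiter(limit=60, window=60.0)
# await xero_limiter.acquire(tenant_id)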
NetSuite: OAuth 1.0a with HMAC-SHA256 signatures
import hmac, hashlib, base64, time, uuid
from urllib.parse import quote

class NetSuiteAuth:
    def generate_auth_header(self, account_id, consumer_key, consumer_secret,
                             token_id, token_secret, http_method, url):
        """
        Every NetSuite request requires computing an HMAC-SHA256
        signature (SHA1 support ended in NetSuite 2023.1).
        Debugging signature mismatches is a known time sink.
        """
        nonce = uuid.uuid4().hex
        timestamp = str(int(time.time()))
        # OAuth 1.0a base string: METHOD & encoded URL & encoded,
        # alphabetically sorted parameter string.
        base_string = '&'.join([
            quote(http_method.upper(), safe=''),
            quote(url, safe=''),
            quote(f'oauth_consumer_key={consumer_key}'
                  f'&oauth_nonce={nonce}'
                  f'&oauth_signature_method=HMAC-SHA256'
                  f'&oauth_timestamp={timestamp}'
                  f'&oauth_token={token_id}'
                  f'&oauth_version=1.0', safe=''),
        ])
        signing_key = f'{quote(consumer_secret, safe="")}&{quote(token_secret, safe="")}'
        signature = base64.b64encode(
            hmac.new(signing_key.encode(), base_string.encode(), hashlib.sha256).digest()
        ).decode()
        return (f'OAuth realm="{account_id}",'
                f'oauth_consumer_key="{consumer_key}",'
                f'oauth_token="{token_id}",'
                f'oauth_signature_method="HMAC-SHA256",'
                f'oauth_timestamp="{timestamp}",'
                f'oauth_nonce="{nonce}",'
                f'oauth_version="1.0",'
                f'oauth_signature="{quote(signature, safe="")}"')
NetSuite also imposes governance limits: each API request consumes governance points, and your integration competes for capacity with the customer's own SuiteScripts. If your customer runs heavy customizations (common in mid-market NetSuite deployments), this throttling is unpredictable.
Sage Intacct: session-based XML auth
import uuid
import requests
import xml.etree.ElementTree as ET

class SageIntacctAuth:
    API_URL = 'https://api.intacct.com/ia/xml/xmlgw.phtml'

    def get_session(self, sender_id, sender_password, company_id,
                    user_id, user_password):
        """
        Sage Intacct uses XML requests. You POST credentials,
        get a session ID, and include it in subsequent requests.
        Sessions expire, so you need retry logic that detects
        expiry and re-authenticates transparently.
        """
        payload = f'''<?xml version="1.0" encoding="UTF-8"?>
<request>
  <control>
    <senderid>{sender_id}</senderid>
    <password>{sender_password}</password>
    <controlid>{uuid.uuid4().hex}</controlid>
    <uniqueid>false</uniqueid>
    <dtdversion>3.0</dtdversion>
  </control>
  <operation>
    <authentication>
      <login>
        <userid>{user_id}</userid>
        <companyid>{company_id}</companyid>
        <password>{user_password}</password>
      </login>
    </authentication>
    <content>
      <function controlid="get_session">
        <getAPISession/>
      </function>
    </content>
  </operation>
</request>'''
        response = requests.post(self.API_URL, data=payload,
                                 headers={'Content-Type': 'application/xml'})
        root = ET.fromstring(response.text)
        return root.find('.//sessionid').text
The common pitfall across all four providers is treating auth as a one-time setup problem. In production, auth is an ongoing operational concern. Tokens expire. Users revoke access from within their accounting platform. API providers deprecate auth mechanisms. Your integration needs to detect failures, retry intelligently, and surface clear error messages when re-authentication is required. Apideck's guide to accounting API integrations for fintech goes deeper on the edge cases.
How do you normalize data models into a single schema?
An invoice in QuickBooks Online is structurally different from an invoice in Xero, which differs again from an invoice in NetSuite. Your application shouldn't care. The standard approach is a canonical data model with a translation layer for each provider.
from dataclasses import dataclass
from decimal import Decimal
from datetime import date
from typing import Optional

@dataclass
class LineItem:
    description: str
    quantity: Decimal
    unit_price: Decimal
    account_code: str  # Maps to provider-specific GL account
    tax_rate_id: Optional[str] = None

@dataclass
class Invoice:
    customer_id: str  # Your internal reference
    line_items: list[LineItem]
    currency: str  # ISO 4217
    due_date: date
    tax_inclusive: bool = False
    external_id: Optional[str] = None  # Provider's invoice ID
Then a translator per provider. Here's what the QuickBooks Online and Xero translations look like side by side:
class QBOTranslator:
    def to_provider(self, invoice: Invoice) -> dict:
        return {
            'CustomerRef': {'value': self.resolve_customer(invoice.customer_id)},
            'Line': [{
                'Amount': float(item.quantity * item.unit_price),
                'DetailType': 'SalesItemLineDetail',
                'SalesItemLineDetail': {
                    'ItemRef': {'value': self.resolve_item(item.account_code)},
                    'Qty': float(item.quantity),
                    'UnitPrice': float(item.unit_price),
                }
            } for item in invoice.line_items],
            'DueDate': invoice.due_date.isoformat(),
        }

class XeroTranslator:
    def to_provider(self, invoice: Invoice) -> dict:
        return {
            'Type': 'ACCREC',
            'Contact': {'ContactID': self.resolve_contact(invoice.customer_id)},
            'LineItems': [{
                'Description': item.description,
                'Quantity': float(item.quantity),
                'UnitAmount': float(item.unit_price),
                'AccountCode': item.account_code,
                'TaxType': self.resolve_tax(item.tax_rate_id),
            } for item in invoice.line_items],
            'DueDate': invoice.due_date.isoformat(),
            'LineAmountTypes': 'Inclusive' if invoice.tax_inclusive else 'Exclusive',
        }
Notice the differences: QuickBooks Online uses CustomerRef with a value ID, while Xero uses Contact with a ContactID. QuickBooks Online requires an explicit line Amount (computed here as quantity × unit price) alongside Qty and UnitPrice, while Xero derives the line amount from Quantity and UnitAmount. QuickBooks Online uses SalesItemLineDetail with an ItemRef, while Xero maps directly to an AccountCode. Tax handling differs too: QuickBooks Online applies tax at the transaction level, while Xero expects a TaxType per line item.
The resolve_customer, resolve_item, and resolve_contact methods are where you map your internal IDs to provider-specific IDs. This mapping table is essential: maintain a local cache of provider entity IDs and refresh it periodically. When a mapping is missing, your pre-validation layer catches it before the API call fails.
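As a sketch of what sits behind those resolve_* methods (the storage interface and method names here are hypothetical; the pattern of "look up locally, fail loudly when unmapped" is the point):

class MissingMappingError(Exception):
    """Raised when an internal ID has no provider-side counterpart yet."""

class EntityMap:
    """Local cache of provider entity IDs, keyed by provider, tenant, entity type, and internal ID."""

    def __init__(self, db, provider: str, tenant_id: str):
        self.db = db
        self.provider = provider
        self.tenant_id = tenant_id

    def resolve(self, entity_type: str, internal_id: str) -> str:
        row = self.db.entity_map.find_one(
            provider=self.provider, tenant_id=self.tenant_id,
            entity_type=entity_type, internal_id=internal_id)
        if row is None:
            raise MissingMappingError(
                f'{entity_type} {internal_id} is not mapped to a {self.provider} record yet')
        return row['provider_id']

    def store(self, entity_type: str, internal_id: str, provider_id: str):
        # Written when you create the entity in the provider, or during the
        # periodic reference-data sync (contacts, accounts, items, tax rates).
        self.db.entity_map.upsert(
            provider=self.provider, tenant_id=self.tenant_id,
            entity_type=entity_type, internal_id=internal_id,
            provider_id=provider_id)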
This architecture isolates provider-specific complexity. Adding a new accounting platform means writing a new translator, not modifying your core product.
How should you handle sync and webhook reliability?
The ideal: your customer creates a payment in QuickBooks, and your app knows within seconds. The reality varies by provider.
QuickBooks Online supports webhooks but only sends change notifications, not payloads. You get told that an Invoice was updated for a specific realm, then fetch the actual data in a follow-up call. Webhooks aren't guaranteed to arrive in order.
Xero webhooks work similarly: event type plus tenant ID, payload fetched separately.
NetSuite has limited webhook support. Most integrations poll via saved searches or RESTlets on a cron schedule, with latency measured in minutes.
Sage Intacct has no webhook support. Polling is the only option.
Given this, a production sync architecture combines webhooks (where available) with scheduled polling as a safety net and idempotent writes everywhere:
// Node.js: Idempotent invoice sync with conflict detection
async function syncInvoice(providerInvoice, provider) {
  const externalId = `${provider}:${providerInvoice.id}`;
  const existing = await db.invoices.findByExternalId(externalId);

  // Skip if we already have a newer version
  if (existing && existing.updatedAt >= providerInvoice.updatedAt) {
    return { status: 'skipped', reason: 'already_current' };
  }

  const translator = translators.get(provider);
  const canonical = translator.toCanonical(providerInvoice);

  if (existing) {
    await db.invoices.update(existing.id, canonical);
    return { status: 'updated', id: existing.id };
  }

  const created = await db.invoices.create({ ...canonical, externalId });
  return { status: 'created', id: created.id };
}

// Webhook handler: fetch full record, then sync
async function handleWebhook(event) {
  const { provider, entityType, entityId, realmId } = event;
  const client = await getProviderClient(provider, realmId);
  const record = await client.fetch(entityType, entityId);
  return syncInvoice(record, provider);
}

// Scheduled poll: catch anything webhooks missed
async function pollForChanges(provider, realmId, lastSyncTime) {
  const client = await getProviderClient(provider, realmId);
  const changed = await client.getModifiedSince(lastSyncTime);
  const results = [];
  for (const record of changed) {
    results.push(await syncInvoice(record, provider));
  }
  return results;
}
The key pattern: prefix external IDs with the provider name (qbo:12345, xero:abc-def) so you can sync the same customer across multiple accounting platforms without ID collisions. And always run the poll even when webhooks are working. Webhooks are a performance optimization, not a reliability guarantee. For a comprehensive view of sync architecture, check out the accounting integration overview.
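One detail worth spelling out is how the poll's cursor advances. A sketch in Python for consistency with the earlier examples (the cursor_store and run_poll names are assumptions; run_poll stands in for the pollForChanges function above):

from datetime import datetime, timedelta, timezone

OVERLAP = timedelta(minutes=5)  # re-read a small window to absorb clock skew between systems

async def scheduled_poll(provider, realm_id, cursor_store, run_poll):
    last_sync = cursor_store.get(provider, realm_id)   # None on the first run
    started_at = datetime.now(timezone.utc)
    since = (last_sync - OVERLAP) if last_sync else None

    results = await run_poll(provider, realm_id, since)

    # Advance the cursor only when the whole window synced; a failure keeps
    # the old cursor so the next scheduled run retries the same window.
    if all(r['status'] in ('created', 'updated', 'skipped') for r in results):
        cursor_store.set(provider, realm_id, started_at)
    return results

The overlap means every run re-reads a few minutes of history; the idempotent syncInvoice above is what makes that safe.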
What errors will you hit in production, and how should you handle them?
Integration guides love the happy path. Production code lives on the unhappy path.
Rate limits
QuickBooks Online returns HTTP 429 above 500 requests per minute per realm. Xero returns 429 with a Retry-After header at 60 calls per minute per tenant. NetSuite returns governance errors when your requests exceed the customer's available capacity. Your retry logic needs exponential backoff with jitter, and it needs to be per-tenant:
import asyncio
import logging
import random

logger = logging.getLogger(__name__)

# RateLimitError is assumed to be your own exception type, raised by your
# provider clients when they see an HTTP 429 or a NetSuite governance error.
async def call_with_backoff(api_call, tenant_id, max_retries=5):
    for attempt in range(max_retries):
        try:
            return await api_call()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise
            # Per-tenant backoff: don't punish all customers for one tenant's limit
            base_delay = min(2 ** attempt, 60)
            jitter = random.uniform(0, base_delay * 0.5)
            delay = base_delay + jitter
            logger.warning(f'Rate limited for tenant {tenant_id}, '
                           f'retry {attempt + 1}/{max_retries} in {delay:.1f}s')
            await asyncio.sleep(delay)
Backing off globally because one customer hit a rate limit is a design bug. Scope your retry state to the tenant.
Validation failures
Accounting systems enforce strict rules on transactions. An invoice in QuickBooks Online requires a valid CustomerRef. If the customer record doesn't exist, the API rejects the call. In Xero, posting against an inactive account fails with a validation error that's easy to miss. In NetSuite, a missing custom segment returns an error that names the segment but not the valid values.
The right approach is pre-validation: before writing, check that referenced entities (customers, accounts, tax codes, items) exist and are active in the target system.
async def pre_validate_invoice(invoice: Invoice, provider_client) -> list[str]:
    errors = []

    # Check customer exists in provider
    customer = await provider_client.get_customer(invoice.customer_id)
    if not customer:
        errors.append(f'Customer {invoice.customer_id} not found in accounting system')
    elif not customer.active:
        errors.append(f'Customer {invoice.customer_id} is archived/inactive')

    # Check each line item's account code
    for i, item in enumerate(invoice.line_items):
        account = await provider_client.get_account(item.account_code)
        if not account:
            errors.append(f'Line {i+1}: account code {item.account_code} not found')
        if item.tax_rate_id:
            tax = await provider_client.get_tax_rate(item.tax_rate_id)
            if not tax or not tax.active:
                errors.append(f'Line {i+1}: tax rate {item.tax_rate_id} inactive or missing')

    return errors
This turns hard API failures into soft pre-flight checks. Your users get actionable error messages ("Account code 4100 is inactive in Xero") instead of cryptic provider responses.
Partial batch failures
If you're syncing 50 invoices and 3 fail validation, commit the 47 successes, surface the 3 errors with enough context for someone to fix them, and allow retry. Distinguish between user-fixable problems ("TAX_EXEMPT is not active in this Xero organization") and infrastructure failures ("Rate limit exceeded, will retry automatically").
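A sketch of that batch shape, reusing the pre-validation helper above (provider_client.create_invoice and the RateLimitError class are assumptions about your own client layer):

async def sync_batch(invoices, provider_client, translator):
    committed, user_errors, retry_later = [], [], []
    for invoice in invoices:
        problems = await pre_validate_invoice(invoice, provider_client)
        if problems:
            # User-fixable: bad mapping, inactive account, missing tax rate
            user_errors.append({'invoice': invoice.external_id or invoice.customer_id,
                                'errors': problems})
            continue
        try:
            result = await provider_client.create_invoice(translator.to_provider(invoice))
            committed.append(result)
        except RateLimitError:
            # Infrastructure failure: queue for automatic retry, don't bother the user
            retry_later.append(invoice)
    return {'committed': committed, 'user_errors': user_errors, 'retry_later': retry_later}

The successes commit as they happen; the two failure buckets drive different UX: one asks the user to fix data, the other just retries.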
Auth failures
A 401 could mean the access token expired (refresh and retry), the user revoked access from inside their accounting platform (prompt re-authentication), or the API credentials are invalid (alert your engineering team). Each needs a different response path.
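A sketch of routing those three cases (the UnauthorizedError type, the connection object, and the error-code strings are placeholders; each provider's actual error payload differs):

async def call_with_auth_handling(api_call, connection):
    try:
        return await api_call(connection.access_token)
    except UnauthorizedError as exc:
        if connection.token_expired():
            # Case 1: expired access token -> refresh and retry once
            connection.refresh()
            return await api_call(connection.access_token)
        if getattr(exc, 'code', None) in ('invalid_grant', 'TOKEN_REVOKED'):
            # Case 2: user revoked access inside their accounting platform ->
            # flag the connection so the UI prompts re-authentication
            connection.mark_needs_reauth()
            raise
        # Case 3: invalid client credentials -> nothing the user can do, alert engineering
        alert_engineering('Invalid API credentials', provider=connection.provider)
        raise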
How does a unified API reduce this work?
Everything above (auth per provider, data model translation, rate limit handling, webhook plumbing, error mapping) is table stakes for a single accounting platform. Multiply it by four, and you're looking at a dedicated integration team doing mostly maintenance.
A unified accounting API compresses this. You write one integration, and the provider handles the per-platform translation. The same invoice creation call works for QuickBooks, Xero, NetSuite, and Sage Intacct:
import { Apideck } from '@apideck/node';

const apideck = new Apideck({
  apiKey: process.env.APIDECK_API_KEY,
  appId: process.env.APIDECK_APP_ID,
  consumerId: 'customer-123'
});

// This works identically across all supported accounting platforms
const response = await apideck.accounting.invoicesAdd({
  invoice: {
    customer: { id: customerId },
    line_items: [{
      description: 'Consulting hours',
      quantity: 5,
      unit_price: 50.00
    }],
    due_date: '2026-05-15'
  }
});
The auth UX is also unified. Instead of building separate OAuth flows for each provider, you embed a single connection component (Apideck calls it Vault) that lets your customers select their accounting platform and authorize access. The UI is white-labeled to match your product.
This doesn't eliminate all direct integration work. If your product needs provider-specific features (NetSuite custom segments, QuickBooks Online class tracking), you may need passthrough access alongside the unified layer. But for the common accounting objects (invoices, payments, contacts, accounts, journal entries), a unified API turns months of per-platform engineering into a single integration. The QuickBooks Online API guide shows the contrast in detail.
What should you build first?
Start with reads. Pull invoice and payment data from your customer's accounting system, validate the data mapping, and build confidence before you write to the GL. Reads are lower risk and surface mapping issues early.
Start with the most common objects. Invoices, contacts, and payments cover the majority of vertical SaaS use cases. Chart of accounts and tax rates are reference data you'll need for mapping. Journal entries are powerful but harder to get right, so save them for v2.
Build your canonical data model from day one, even if you're starting with a single provider. The translation layer costs minimal extra effort upfront and saves weeks when you add the second platform.
If you're supporting more than one accounting platform, evaluate whether a unified API fits before committing to direct integration for each. The engineering time saved on platforms two through five is where the ROI shows up. For many vertical SaaS teams, getting to multi-platform support in weeks instead of quarters is the difference between winning and losing deals.
Ready to get started?
Scale your integration strategy and deliver the integrations your customers need in record time.