
Microsoft Great Plains Integration Manager Advanced Techniques


Understanding Great Plains Integration Options

Great Plains, now part of the Microsoft Dynamics family, serves as a robust ERP solution for many mid‑market businesses. The system offers several paths to move data in and out of the database, and choosing the right path depends on transaction volume, data complexity, and the skill set available within your team.

The first and most straightforward route is SQL scripting. Stored procedures and views give you fine control over how data is retrieved or updated. You can write logic directly in the database layer, which means every transaction hits the database in a single, optimized call. This approach is especially handy when you have a small number of high‑impact changes, such as updating a master record or posting a batch of invoices.
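As a minimal sketch of this pattern, the script below makes a single call to a stored procedure through classic ADO. The server, database, and procedure name (cspSetVendorHold) are placeholders, not objects that ship with Great Plains:

' Single round trip to a hypothetical stored procedure.
' GPSERVER, TWO, and cspSetVendorHold are placeholder names.
Dim conn, cmd
Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=GPSERVER;" & _
          "Initial Catalog=TWO;Integrated Security=SSPI;"

Set cmd = CreateObject("ADODB.Command")
cmd.ActiveConnection = conn
cmd.CommandType = 4                      ' adCmdStoredProc
cmd.CommandText = "cspSetVendorHold"
' adVarChar (200), input direction (1), length 15
cmd.Parameters.Append cmd.CreateParameter("@VENDORID", 200, 1, 15, "JOHNB")
cmd.Execute

conn.Close

Because the logic lives in the procedure, the application side stays thin: one call, one transaction, one optimized plan.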

For developers who prefer a programmatic environment, ADO.NET provides a bridge between .NET code and the Great Plains data model. By connecting to the database with a SqlConnection, you can execute commands, stream results through data readers, and bind the output to strongly typed objects. ADO.NET also supports transaction scopes and parameterized queries, which add safety and performance to the integration process.
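ADO.NET itself lives in .NET code (SqlConnection and SqlCommand in C# or VB.NET), but the connect, parameterize, execute shape it follows can be sketched in classic ADO script form. The query below reads the Great Plains vendor master table PM00200; the connection details are placeholders:

' Parameterized lookup following the same shape an ADO.NET
' SqlCommand would use. Connection values are placeholders.
Dim conn, cmd, rs
Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=GPSERVER;" & _
          "Initial Catalog=TWO;Integrated Security=SSPI;"

Set cmd = CreateObject("ADODB.Command")
cmd.ActiveConnection = conn
' The ? placeholder is bound below, never concatenated into the SQL
cmd.CommandText = "SELECT VENDNAME FROM PM00200 WHERE VENDORID = ?"
cmd.Parameters.Append cmd.CreateParameter("@VendorID", 200, 1, 15, "JOHNB")

Set rs = cmd.Execute
If Not rs.EOF Then
    MsgBox rs("VENDNAME")    ' vendor name for the matching ID
End If
rs.Close
conn.Close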

A third option lies in SQL Server Integration Services (SSIS). The successor to Data Transformation Services (DTS), SSIS lets you build packages that extract data from external sources, transform it, and load it into Great Plains. SSIS is ideal when you need to orchestrate complex data flows involving multiple steps, such as cleaning legacy data, mapping fields, and handling errors. The visual design surface makes it easy to share and maintain packages across teams.

When you step away from direct database manipulation, you encounter the eConnect SDK. eConnect exposes the Great Plains business objects through a programmable interface. Clients construct XML documents that represent invoices, vendors, or general ledger entries and submit them through the eConnect API. eConnect validates the data against Great Plains business rules before committing, reducing the risk of corrupting the system. However, because each call carries per-document processing overhead, eConnect is best suited for moderate throughput.
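A hedged sketch of what such a payload looks like for a vendor record appears below. The PMVendorMasterType and taUpdateCreateVendorRcrd element names follow the eConnect vendor schema, but verify every node name against the eConnect SDK documentation for your version:

' Assemble an eConnect-style XML document for a vendor record.
' Verify element names against the eConnect SDK schemas before use.
Dim xml
xml = "<eConnect>" & _
        "<PMVendorMasterType>" & _
          "<taUpdateCreateVendorRcrd>" & _
            "<VENDORID>JOHNB</VENDORID>" & _
            "<VENDNAME>John B. Consulting</VENDNAME>" & _
            "<VNDCLSID>USA-VENDOR</VNDCLSID>" & _
          "</taUpdateCreateVendorRcrd>" & _
        "</PMVendorMasterType>" & _
      "</eConnect>"
' The finished document is handed to the eConnect API, which validates
' it against Great Plains business rules before anything is committed.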

All of these methods work well in isolation, but many organizations build a hybrid strategy. For instance, they may use SSIS to pull data from a legacy system, then transform it into eConnect calls that insert or update the Great Plains database. Alternatively, a small batch of low‑volume, high‑complexity transactions might go through a custom VBScript event in Integration Manager, while the bulk of the load goes straight to the database via stored procedures.

The choice ultimately hinges on performance, maintainability, and the skill set your team brings to the table. If you are comfortable writing T‑SQL and can tune indexes effectively, the database layer offers unmatched speed. If your team is more familiar with .NET and needs to build reusable components, ADO.NET or eConnect may fit better. And if you need a visual workflow that handles data cleansing and error logging, SSIS is the tool of choice.

Building a User‑Friendly Tool with Integration Manager and VBScript

Integration Manager, part of the Great Plains suite, allows you to build lightweight adapters that read from flat files and push data into the system. Although it is not as fast as a direct database approach, its simplicity makes it attractive for low‑volume scenarios where you want to avoid complex development.

Start by launching Integration Manager and opening the integration project you plan to modify. In the integration editor, right‑click the project node and select Properties. Switch to the Scripts tab and open the Before Document script. Paste the following code to set a custom document date and to reject any records that lack mandatory fields:

If Left(SourceFields("F4"), 4) = "THRU" Then
    ' Capture the document date carried in the THRU record
    SetVariable "DocDate", Right(SourceFields("F4"), 8)
End If

If IsNull(SourceFields("F1")) Or IsNull(SourceFields("F7")) Then
    ' Reject any record that lacks the mandatory fields
    CancelDocument
End If

This snippet first checks whether the fourth field starts with "THRU". If it does, it extracts the last eight characters of that field and stores them in a variable called DocDate. Then it verifies that both field F1 and field F7 contain data; if either is missing, the document is aborted, preventing incomplete records from slipping into the system.

Next, open the Before Document Commit script. Here you build the document number, set the posting date, and populate check details. Insert the following code:

Dim MyTime, MyHour

MyTime = Now
MyHour = Hour(MyTime)

If Not IsNull(GetVariable("DocDate")) Then
    ' Increment the running document counter
    SetVariable "DocNumber", GetVariable("DocNumber") + 1

    ' Composite document number: THRU + date + "A" + counter
    DestinationFields("Document Number") = "THRU" & GetVariable("DocDate") & "A" & GetVariable("DocNumber")
    DestinationFields("Posting Date") = GetVariable("DocDate")
    DestinationFields("Description") = "Client Check " & SourceFields("F1")

    ' Use the source check number when present; otherwise synthesize one
    If Not IsNull(SourceFields("F8")) Then
        DestinationFields("Check.Check Number") = SourceFields("F8")
    Else
        DestinationFields("Check.Check Number") = GetVariable("DocDate") & MyHour & GetVariable("DocNumber")
    End If

    ' Advance the check date one day unless that would cross into the next month
    DestinationFields("Check.Date") = GetVariable("DocDate")
    If Month(DestinationFields("Check.Date") + 1) = Month(DestinationFields("Check.Date")) Then
        DestinationFields("Check.Date") = DestinationFields("Check.Date") + 1
    End If
End If

In this block, you start by grabbing the current time and hour. Then you confirm that DocDate is available. The script increments a DocNumber counter, constructs a composite document number that includes the date and a letter "A", and writes it to the destination field. The posting date is set to DocDate, and a description is built from a static phrase plus the value from field F1.

The check number logic uses a conditional: if field F8 contains a value, it becomes the check number; otherwise, the script creates a synthetic number by combining DocDate, the current hour, and the incremented DocNumber. The check date is set to DocDate and then advanced by one day, but only when the extra day stays within the same month, so the date never rolls into the next period.

Finally, add a clean‑up step in the After Integration script:

ClearVariables

Clearing variables after each integration run frees memory and guarantees that stale values do not leak into subsequent documents. This simple routine helps keep the tool lean and prevents unexpected behavior in future runs.

Because Integration Manager processes documents one at a time, the runtime grows linearly with the number of records. For small batches - roughly a hundred transactions per day - this tool works well. If you start handling thousands of records, consider moving the logic to a stored procedure or a Dexterity server process, where database engine optimizations can drastically reduce execution time.

Optimizing Performance and Choosing the Right Approach

When deciding between Integration Manager, stored procedures, or a Dexterity process server, the primary driver is transaction volume. Integration Manager shines with simple, low‑volume loads that require custom filtering or transformation before data reaches the database. Because it runs inside the Great Plains client, it has quick access to the data model but suffers from the overhead of loading the entire integration engine for each run.

Stored procedures, by contrast, execute entirely within the SQL Server engine. They can process thousands of records in a single batch, use index seeks, and avoid the round‑trip cost of an application layer. For tasks like bulk invoice posting or mass vendor updates, writing a set of T‑SQL procedures is often the most efficient route. The trade‑off is that you must maintain the procedures in a separate version‑control repository and keep the database schema in sync with any business changes.

Dexterity, the native development platform for Great Plains, sits between the two extremes. Paired with the Great Plains Process Server, it lets developers write server‑side code in sanScript, a strongly typed language, and that code runs in the same environment Great Plains uses for its own business logic. This gives you the flexibility of application code while keeping the performance advantages of server execution. When you need to perform complex calculations, interact with external services, or maintain an internal cache, a Dexterity process can be a good fit.

Regardless of the method, there are universal tuning practices that help keep integration fast. Index the fields you filter on, such as the document number or posting date. Batch inserts through bulk APIs or staging tables to avoid row‑by‑row commits. Use parameterized statements so SQL Server can reuse cached query plans instead of re‑parsing each statement. And always profile the integration with SQL Server Management Studio's Activity Monitor or the Dynamic Management Views to spot bottlenecks. The sketch below illustrates the staging‑table pattern.
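This is a sketch only; IM_StagingInvoices, cspPostStagedInvoices, and the index name are invented and would need to match your own schema:

' Staging-table pattern: bulk-load first, then post in one set-based call.
' All object names here are invented placeholders.
Dim conn
Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=GPSERVER;" & _
          "Initial Catalog=TWO;Integrated Security=SSPI;"

' An index on the filter column keeps the posting query on index seeks
conn.Execute "IF NOT EXISTS (SELECT 1 FROM sys.indexes " & _
             "WHERE name = 'IX_Staging_DocNumber') " & _
             "CREATE INDEX IX_Staging_DocNumber ON IM_StagingInvoices (DocNumber)"

' Rows were bulk-loaded earlier (BULK INSERT, bcp, or an SSIS data flow);
' a single stored procedure call now posts the entire batch.
conn.Execute "EXEC cspPostStagedInvoices"

conn.Close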

Beyond raw performance, consider maintainability. A script in Integration Manager is easy to modify and deploy without recompiling the entire application. A stored procedure requires a deployment script and DBA involvement. A Dexterity process demands more complex build and release steps but offers deeper integration with Great Plains business rules. Balance these factors against your organization’s release cycle and skill set to choose the optimal tool for each integration scenario.

Vendor Translation and Mapping Techniques

Many businesses use Great Plains to track accounts payable, but the vendor records they import often come from external systems with different identifiers. A common challenge is mapping these source vendor codes to the internal Great Plains vendor IDs. Integration Manager offers a translation facility that lets you replace the imported ID with the correct one before the record reaches the database.

To set up a translation, open the destination mapping editor for your AP integration. Select the Vendor ID field and click the rule properties button. When prompted, answer “No” to the “Use existing translation” question, then switch to the Translation tab. Here you paste a list of source-to-target mappings copied from Excel or another spreadsheet:

JOHNB    BANKOFAMERICA
PETERP   BANKOFAMERICA
MARTHAM  CITIBANK

When the integration runs, the engine scans each incoming record’s Vendor ID. If it finds a match in the translation table, it replaces the original code with the mapped Great Plains vendor ID. This simple replacement step ensures that checks, payments, or expense reports are posted against the correct bank account rather than a placeholder vendor.

Translation tables can be stored as separate files, in a database table, or even hard‑coded in a script. For larger, dynamic mapping scenarios, storing the table in the database and querying it during the integration run can simplify updates. You only need to reload the table when vendor relationships change, avoiding the need to touch the integration project itself.
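As a minimal sketch of that database‑driven variant, the Before Document script below looks up the incoming code in an assumed mapping table named IM_VendorMap with columns SourceCode and GPVendorID; the source field, destination field, and connection details are placeholders:

' Database-driven translation inside a Before Document script.
' IM_VendorMap and its columns are assumed, not shipped with Great Plains.
Dim conn, rs, sql
Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=GPSERVER;" & _
          "Initial Catalog=TWO;Integrated Security=SSPI;"

' Escape embedded quotes, then look up the incoming source code
sql = "SELECT GPVendorID FROM IM_VendorMap WHERE SourceCode = '" & _
      Replace(SourceFields("F2"), "'", "''") & "'"
Set rs = conn.Execute(sql)

If Not rs.EOF Then
    DestinationFields("Vendor ID") = rs("GPVendorID")
Else
    CancelDocument    ' no mapping found - reject rather than post a bad vendor
End If

rs.Close
conn.Close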

While vendor translation addresses a specific need, the same concept applies to any field that requires look‑ups or canonicalization. Product codes, location identifiers, or tax categories can all benefit from a translation layer that normalizes incoming data before it touches the core system.

After implementing translation, test the integration with a representative sample of records. Verify that the mapped vendor appears correctly in the posting journal, and that the original source code no longer shows up in the database. This validation step catches any mismatches or typos in the translation file early, preventing downstream errors in payment processing.

Andrew Karasev, Chief Technology Officer at Alba Spectrum Technologies, emphasizes that the power of a well‑designed translation layer lies in its ability to keep the business side of an organization aligned with the data layer. By ensuring that vendor records, tax codes, and other reference data are consistently mapped, the organization reduces manual data cleanup and improves audit trails.
