Sage 200 UK

Introduction

The Templates installation provides you with a ready-to-use set of Views, Dashboards and Reports.

The installation (assuming the Central Point is freshly installed and empty) consists of two steps:

  • Configuring the Data Sources for the ERP and Data Warehouse (optional) databases.
  • Importing the Templates into the Central Point.

Additional steps, such as building and loading the OLAP Cubes, may also be required.

Nectari DataSync

DataSync is required for Sage 200 UK only if you want to consolidate data from several Companies into one database so that Reports from different Companies can be viewed together.

Important

Skip this section if you prefer to query the ERP database directly (Live Data) instead of a copy of it.

Prerequisites

  • A version of Sage 200 UK supported by Sage
  • A destination database (preferably using the same collation as your Sage 200 UK database)
  • A valid version of DataSync (refer to Installing DataSync for more information)
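To check the collation of your Sage 200 UK database before creating the destination database, you can run a quick query such as the one below (DatabaseName is a placeholder; this snippet is only illustrative):

  -- Returns the collation of the Sage 200 UK database
  SELECT DATABASEPROPERTYEX('DatabaseName', 'Collation') AS SourceCollation;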

Set the Connections

Important

Each database you want to consolidate needs two Source Connections:

  • One with the Tracking Type set to Date
  • One with the Tracking Type set to None
Note

Not all Sage 200 UK tables contain the DateUpdated field, so some tables cannot be loaded incrementally. These tables are usually very small and do not pose a problem.

Important
  • Time Zone must be set to the time zone of the Sage 200 UK application server.
Example

You should have a result similar to this for the Source connection:

Example

You should have a result similar to this for the Destination connection:
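For reference, the Date tracking type relies on each table's DateUpdated column to detect changes. The snippet below is purely illustrative (the table name and the @LastRunDate value are assumptions) and shows the kind of filter an incremental pass implies:

  -- Only rows changed since the previous incremental run are picked up
  DECLARE @LastRunDate DATETIME = '2024-01-01';  -- illustrative value tracked by DataSync
  SELECT *
  FROM dbo.StockItem               -- hypothetical Sage 200 UK table with a DateUpdated field
  WHERE DateUpdated > @LastRunDate;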

Importing the Extractions

This feature allows you to import a pre-defined template or to restore a backup that you previously created with the Export feature (refer to Export an Extraction for more details).

Some pre-defined templates are already available; if you don't have access to them, please contact your partner. For example, one pre-defined template defines the list of tables and fields to be synchronized when sending Sage 300, X3, Acumatica or Salesforce data to the Cloud.

Important

One Extraction is required for each Source connection you created in order to retrieve data properly.

  1. Click on one of the extractions in the list, then on the Import icon located in the upper right-hand corner.
  2. In the Import Extraction window, click on the Choose a zip file hyperlink to browse to the location where you saved the export .zip file, or drag the file directly into that window, then click on Next.
Note

For Sage 200 UK, four zip files will be provided.

  • If you are doing a consolidation, import the DS_EXTR_[Software-Version]_[Extraction-Version]_Sage 200 UK with refresh DS-CONSO.zip file.
  • If you only have one Company and want to replicate the database, import the DS_EXTR_[Software-Version]_[Extraction-Version]_Sage 200 UK with refresh DS-SYNC.zip file to perform a synchronization.
  3. On the left pane, select the type of extraction you want to perform and click on Next.
Important

For Sage 200 UK, you must use CPYID as the Column Name and the Company codes as the Unique Identifier field.

  4. Refer to Setup the Extraction Panel to define the extraction and click on Import to finish the process.
  5. You should have a result similar to this:

The diagram below illustrates how DataSync processes data.

Note

The Extractions window will automatically switch to the Tables window.

Refer to Add a SQL Query if you want to add SQL statements to some tables, and to Setup the Field Section to customize the fields (add calculations, change destination names, etc.).
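For context, in a consolidated destination the rows of every Company share the same tables and are distinguished by the CPYID column mentioned above. The query below is only a sketch (the destination schema and table names are hypothetical) of how consolidated data can be split back out per Company:

  -- Row counts per Company in a consolidated destination table
  SELECT CPYID, COUNT(*) AS RowsPerCompany
  FROM DestinationSchemaInDataSync.SLCustomerAccount  -- hypothetical destination table
  GROUP BY CPYID;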

 

Validating and Building the Extractions

Once your extraction (source and destination connections and their related tables) is set up, the next step is to validate its settings before you can run it.

The feature will:

  • Ensure that all the tables/fields exist in the source connections,
  • Validate all SQL queries or calculated fields,
  • Ensure that data integrity in the destination connection is not affected (e.g. by a change to the table structure).

To do so:

  1. Select the extraction you want to validate and build in the list and click on the Validate and Build icon.
  2. In the new window, choose the action which best fits your needs and click on Build (Validate for Migration and Export extraction types).
Note

The available choices differ according to the extraction type you select.

Example

For Synchronization and Consolidation extraction types:

For Migration and Export extraction types:

  3. Wait for the process to finish.
Note

A Validation report window appears once the process is finished to give you a quick overview. The results are displayed in the Status column; if there is an error, you can get more details by clicking on the hyperlink in the Error column, which leads to the Log Page.
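As an illustration only, the existence checks performed during validation are conceptually similar to querying the SQL Server catalog views. The schema, table and column names below are placeholders:

  -- Conceptual check: does the expected field exist in the source connection?
  SELECT COUNT(*) AS FieldExists
  FROM INFORMATION_SCHEMA.COLUMNS
  WHERE TABLE_SCHEMA = 'dbo'
    AND TABLE_NAME   = 'StockItem'
    AND COLUMN_NAME  = 'DateUpdated';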

Running the Extractions

Once your data has been validated (refer to Validate and Build an Extraction for more details), you can run the extraction manually if you want an immediate result instead of scheduling it.

  1. Select the extraction you want to run in the list and click on the Run Extraction Now icon.
  2. In the upper right-hand corner, choose the action you want to execute and the table(s), then click on Run.
Example


Note

Load (for the Migration extraction type only): Loads all data in your destination from your source.

Truncate and Load: Replaces all data in your destination with the current data from your source.

Incremental Load: Retrieves only records that have changed since your last Incremental Load and replaces their corresponding records in your destination with the updated ones.

Process Deleted Records: The maximum number of days over which the process checks whether records have been deleted, based on the last changed date. For example, if the value is set to 30 days, the system checks all transactions created or updated in the last 30 days and validates whether they still exist in the source. If they no longer exist in the source, they are deleted from the destination.

  3. Wait for the process to finish.
Note

When the process is finished, the results are displayed in the Status column. If there is an error, you can view more details on it by clicking on the hyperlink in the Error column, which leads to the Log Page.
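To make the Process Deleted Records behaviour described above more concrete, the following sketch is purely illustrative (the table names, key column and 30-day window are assumptions) of the kind of comparison involved:

  -- Destination records changed in the last 30 days that no longer exist
  -- in the source are removed from the destination
  DELETE d
  FROM DestinationSchemaInDataSync.StockItem AS d
  WHERE d.DateUpdated >= DATEADD(DAY, -30, GETDATE())
    AND NOT EXISTS (SELECT 1
                    FROM dbo.StockItem AS s
                    WHERE s.Code = d.Code);  -- hypothetical key column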

Data Source Configuration

Environments and Data Sources

Tip

The description given to a Data Source when it is first created is used across all environments to describe that specific Data Source.

Give it a generic description the first time (e.g. ERP Data Source, Cube Data Source) and, if necessary, rename it after the first environment has been created.

The following information is needed to configure the Data Sources:

  • Database server credentials: Server name, Instance, Authentication strategy.
  • Main ERP database information: Database and schema name.

ERP Data Source

  1. In the upper right-hand corner, click on the icon to access the Administration section.
  2. On the left pane, select Env. & Data Sources.
  3. By default, there is already an environment called Production, which you can rename by double-clicking in the name box. Once changed, press the Enter key.
  4. In the Data Sources section, click on Add to create the first Data Source.
  5. Complete the ERP Data Source configuration. See instructions for MS SQL Server below.

Important

If you are not using DataSync, the Custom Schema Name must be placed at the top of the list. If this is not done, the Data Models for the UDM configuration will generate an error.

Data Source description:
If you use DataSync:
Sage 200 UK DATASYNC
If you don't use DataSync:
Sage 200 UK VIEWS
For UDM Template:
Sage 200 UK TABLES
Type:
SQLSERVER
Server:
Database server of Sage 200 UK
Database name:
Name of the Sage 200 UK database (beware of case sensitivity)
Database schema name:
Create the following two entries by clicking on the icon (replace DatabaseName with the appropriate value):
If you use DataSync:
DatabaseName.DestinationSchemaInDataSync
DatabaseName.NEC_CUSTOM_SCHEMA
If you don't use DataSync:
DatabaseName.NEC_CUSTOM_SCHEMA
DatabaseName.dbo
For UDM Template:
DatabaseName.dbo
DatabaseName.NEC_CUSTOM_SCHEMA
Note

The second line contains the Nectari Custom Schema.

You can use a different one, but we highly recommend following this naming convention:

  • Start with NEC
  • Use all capitals
  • Separate words by an underscore
Important

Choose a unique Custom Schema name for each Environment.

Nectari schema:
Enter the chosen Nectari custom schema for the current environment
Authentication strategy:
UseSpecific
User Name:
SQL User accessing the Sage 200 UK database. For example, sa.
Password:
The user's password.
  6. Click on Validate then on Save to complete the configuration of the Data Source.
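If you want to prepare the Nectari custom schema manually in the ERP database, the statement below is one way to do it, following the naming convention above (a sketch only; confirm with your administrator whether the schema is created for you in your setup):

  USE DatabaseName;  -- replace with your Sage 200 UK database
  GO
  CREATE SCHEMA NEC_CUSTOM_SCHEMA AUTHORIZATION dbo;  -- example name following the NEC_ convention
  GO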

Cube Data Source

In the same Environment as the ERP Data Source, create a new Data Source for the OLAP Cube.

Complete the Data Source Definition with all the appropriate information.

The screenshot below provides an example of this.

Server:
Database server where the Nectari OLAP For SQL Server package is installed.
Database name:
NectariCube.
Database schema name:
NectariCube.NEC_FOLDER (replace FOLDER with the folder name), where NEC_FOLDER is the schema used in the ERP Database of the same environment.
Nectari schema:
Enter the chosen custom schema for the current environment
  • Click on Validate then on Save.
  • Click on Set as Data Warehouse to define the Data Source as a Data Warehouse then enter the following information:
Database warehouse schema:
Enter the chosen Nectari custom schema again.
Use MARS during the cube loading:
Unchecked
  • Click on Validate then on Save.
Tip

Refer to Environments and Data Sources for more details about the MARS option.
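As an optional sanity check before validating the Cube Data Source, you can confirm that the expected schema exists in the NectariCube database. The snippet below is only a sketch; replace NEC_FOLDER with your actual schema name:

  USE NectariCube;
  GO
  SELECT name
  FROM sys.schemas
  WHERE name = 'NEC_FOLDER';  -- should return one row if the schema exists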

Importing Templates

For each environment, the following previously configured information will be required:

  • ERP Database Name
  • Nectari Custom Schema
  • ERP Schema

Download the Template file: TPL_9.5.XXX_Sage200.zip.

The XXX represents the build number of the template (use the highest available).

Running the Import Template

  1. In the upper right-hand corner, click on the icon to access the Administration section.
  2. In the Administration section, click on the Templates drop-down menu in the left pane.
  3. Select Import Template.
  4. Choose the specific location where the new templates will be installed and click on Next.
    Note

    Usually, the Root folder is used.

  5. In the Import Template window, click on Select files....
  6. Browse to the folder where you saved the Template .zip file, select it, then click on Open.
  7. In the Data Sources Mapping screen, associate the Data Sources (ERP) listed in the Received Data Sources Description column (those from the template) with the Data Sources you previously defined in the Central Point (listed in the Current Data Sources Description column).
    • In the Received Data Sources Description column, ensure that only the checkboxes of the Data Sources you want to use from the template are ticked.
    • In the Current Data Sources Description column, click on Bind a Data Source to access the drop-down list of existing Data Sources, then click on Next.

In the next screen, all of the Template's content is displayed against what the Central Point already contains.

By default, on a first install, everything is set to Add (leave the default values).

  • In the case of a first installation, the first four columns will display None and Never Installed, the next three will detail the Template content, and the last three give you the choice to Add, Update or Skip during the installation.
    Note

    In the case of an update, you can refer to Updating template for more details.

  8. Click on Next (this can take some time).
  9. Once this has been completed, a window prompts you to enter the parameters necessary to create the custom objects.
  10. If more than one Environment has been created, you will see a column for each Environment. You can untick an Environment's checkbox, in which case the Global Scripts will not run in it.


  11. Complete the parameters (see examples below) and click on Next.
  12. After importing, an Execution Report is produced, as shown below.
    Note

    The first section is for the ERP Data Source and the one below it is for the Cube Data Source.

    You can click on the button to see the details of each script individually. If no failures are reported, close the window.

  13. If any of the scripts failed to run, a fail icon is displayed. Click on the fail icon to open the Report Preview window, which shows the corresponding SQL script.
  • Copy this script, debug it, and run it separately if needed. Users who are proficient in SQL can debug it directly in the Report Preview window and rerun it by clicking on the Try to rerun button.

Updating template

Important

Some considerations you must take into account before starting:

  • Making fresh backups of both the Nectari database and Central Point before doing a template update is highly recommended.
  • Check the Nectari Data Models and Nectari custom SQL objects that may have been delivered with the initial template version, as you might lose these customizations upon updating.
  • You must have a template version that matches the installed software version. If you are using Nectari 9, the template should also be version 9.
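A minimal backup sketch for the recommendation above (the database name and file path are placeholders; adapt them to your environment and existing backup strategy):

  -- Full copy-only backup of the Nectari repository database before updating the template
  BACKUP DATABASE [Nectari]
  TO DISK = N'D:\Backups\Nectari_before_template_update.bak'
  WITH COPY_ONLY, INIT;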

Upgrading the Nectari software only updates the software, not the template. In other words, the existing Nectari Data Models and Views won't be affected.

Note

After a software upgrade, it is not mandatory to systematically perform a template update. A template update is useful if you have encountered problems with specific Nectari Data Models or Nectari custom SQL objects, as it includes fixes.

To update a template:

  1. After having mapped the Data sources, tick the checkboxes of the objects you want to upgrade and click on Next.
    Note

    By default, no checkbox in the Update column is ticked. If there is a new Data Model or View, its Add checkbox will be ticked. Select Skip if you want to ignore it.

    Important

    If you tick the Update checkbox, the existing Nectari objects associated with that Data Model, or connected to it through dependencies, will be overwritten. Note that any customizations that have been made will be lost.

  2. Select the environment in which the scripts will be executed and click on Next.
  3. Complete the parameters and click on Next.
  4. In the Execution report window, if any of the scripts failed to run, a fail icon is displayed. Click on the fail icon to open the Report Preview window, which shows the corresponding SQL script.
  5. Copy this script, debug it, and run it separately if needed. Users who are proficient in SQL can debug it directly in the Report Preview window and rerun it by clicking on the Try to rerun button.
Important

Web browsers have updated their policies regarding cookies, and these changes must be applied to your Web Client if you want Nectari embedded in your ERP website or want to use Single Sign-On (SSO). Refer to Cookie Management for more details.

Using Excel Add-In with Sage 200 UK

This section helps you configure the Excel Add-In so it can be used with the Sage 200 UK template provided in the SEI Financials for Sage 200.xlsx file.

  1. Open Excel and follow the procedure described in Login to Nectari Excel Add-in to log in.
  2. Follow the procedure described in Environment Configurations to select the Environment based on your configuration as indicated.
Note

Select Sync if you use DataSync and Live if you don't.

  3. Follow the procedure described in Data Model Configurations to map the right Data extraction to the right Reference Data Model and Data Model as indicated.
  4. Fill out the fields in the Main tab and click on Refresh Formulas to retrieve the data.