SQL Organisation Import

==About the SQL Organisation Import Utility==

The utility provides a quick and easy method of uploading and updating organisations in a Hornbill instance from an external data source.
 
==Open Source==

The Hornbill SQL Organisation Import Utility is provided open source under the [https://wiki.hornbill.com/index.php/The_Hornbill_Community_License_(HCL) Hornbill Community Licence] and can be found [https://github.com/hornbill/goHornbillOrgImport here] on GitHub.
  
 
==Installation Overview==

===Windows Installation===

* Download the [https://github.com/hornbill/goHornbillOrgImport/releases/latest ZIP archive] relevant to your OS and architecture. This contains the tool executable, configuration file and license;
* Extract the ZIP archive into a folder you would like the application to run from, e.g. 'C:\organisation_import\'.
  
=Configuration Overview=

The configuration of this utility is managed through a JSON file (conf.json), which is supplied with each release:

<code>
{
    "APIKey": "your_api_key_here",
    "InstanceId": "your_instance_id_here",
    "OrganisationAction": "Create/Update/Both",
    "SQLConf": {
        "Driver": "",
        "Server": "",
        "Database": "",
        "UserName": "",
        "Password": "",
        "Port": 0,
        "Encrypt": false,
        "OrganizationName": "FIELD 1",
        "Query": "SELECT <field_list> FROM <table_or_file>"
    },
    "OrganizationMapping":{
        "organization_name":"FIELD 1",
        "address":"FIELD 2",
        "city":"FIELD 3",
        "state":"FIELD 4",
        "postal_code":"FIELD 5",
        "country":"FIELD 6",
        "industry":"FIELD 7",
        "phone_number":"FIELD 8",
        "website":"FIELD 9",
        "language":"FIELD 10",
        "custom_1":"FIELD 11",
        "custom_2":"FIELD 12",
        "custom_3":"FIELD 13",
        "custom_4":"FIELD 14",
        "custom_5":"FIELD 15",
        "custom_6":"FIELD 16",
        "custom_7":"FIELD 17",
        "custom_8":"FIELD 18",
        "custom_9":"FIELD 19",
        "custom_10":"FIELD 20",
        "custom_11":"FIELD 21"
    }
}
</code>
  
  
*'''APIKey''' - a Hornbill API key for a user account with the correct permissions to carry out all of the required API calls
*'''InstanceId''' - your Hornbill instance ID. You can get the instance ID from the URL used to access Hornbill: <nowiki>https://live.hornbill.com/your_instance_id/</nowiki>. This value is case sensitive.
*'''OrganisationAction''' - one of the following values:
**'''Create''' = the import will only create new organisations in your instance.
***If the imported organisation already exists and this is set to "Create" then the organisation will not be created
**'''Update''' = the import will only update existing organisations in your instance.
***If the imported organisation does not exist and this is set to "Update" then the organisation will not be updated
**'''Both''' = the import will create new organisations and update existing organisations in your instance.
***If the imported organisation does not exist and this is set to "Both" then the organisation will be created
***If the imported organisation does exist and this is set to "Both" then the organisation will be updated
  
  
'''SQLConf'''

*'''Driver''' - the driver to use to connect to the database that holds the organisation information. Needs to be one of the following:
**'''mssql''' = Microsoft SQL Server (2005 or above)
**'''mysql''' = MySQL Server 4.1+, MariaDB
**'''mysql320''' = MySQL Server v3.2.0 to v4.0
**'''csv''' = ODBC Data Source using the MS Access Text Driver (*.txt, *.csv)
**'''excel''' = ODBC Data Source using the MS Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)
 
  
When using the mssql, mysql or mysql320 drivers, the '''Server''', '''Database''', '''UserName''', '''Password''' and '''Port''' parameters should be populated accordingly:

*'''Server''' = the address of the SQL server (e.g. localhost)
*'''Database''' = the name of the database to connect to
*'''UserName''' = the SQL authentication username to connect to the database (if any, e.g. root)
*'''Password''' = the password for the above username (if any)
*'''Port''' = the SQL port (e.g. 5002)

When using csv or excel as a data source, the '''Database''' parameter should be populated accordingly:

*'''Database''' = the name of the ODBC connection (the name of the Data Source - DSN) on the PC that is running the tool

In addition, when using any of the available drivers, the '''Encrypt''', '''OrganizationName''' and '''Query''' parameters should also be populated accordingly:

*'''Query''' = the SQL query used to retrieve the organisation records (e.g. SELECT * FROM <database_table_or_file>)
*'''OrganizationName''' = specifies which field from your data source contains the organisation name
*'''Encrypt''' = boolean value specifying whether the connection between the tool and the database should be encrypted.

''NOTE'': There is a bug in SQL Server 2008 and below that causes the connection to fail if the connection is encrypted. Only set this to true if your SQL Server has been patched accordingly.
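As an illustration, a populated '''SQLConf''' block for a Microsoft SQL Server source might look like the following. The server, database, credentials and column names here are example values only; substitute your own:

<code>
    "SQLConf": {
        "Driver": "mssql",
        "Server": "localhost",
        "Database": "CustomerDB",
        "UserName": "importuser",
        "Password": "your_password_here",
        "Port": 1433,
        "Encrypt": false,
        "OrganizationName": "company_name",
        "Query": "SELECT company_name, address_line, city, country FROM organisations"
    },
</code>

For a csv source, the equivalent block would set "Driver" to "csv", leave "Server", "UserName" and "Password" empty, set "Port" to 0, and set "Database" to the name of your ODBC Data Source (DSN).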
 
'''OrganizationMapping'''

*Maps data from your data source into the generic Hornbill organisation record
*Any value wrapped with "" will be populated with the corresponding response from the SQL query, or treated literally as written if there is no corresponding field.
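Before running the import it can save time to sanity-check conf.json against the structure described above. The following sketch is a hypothetical helper, not part of the utility itself; it simply checks that the documented keys are present and that '''OrganisationAction''' and '''Driver''' hold one of the accepted values:

```python
# Pre-flight check for conf.json (hypothetical helper, not part of the
# import utility): verifies that the keys described above are present.
# Load your real file with json.load() and pass the dict to check_conf().

REQUIRED_TOP = ("APIKey", "InstanceId", "OrganisationAction",
                "SQLConf", "OrganizationMapping")
REQUIRED_SQLCONF = ("Driver", "Server", "Database", "UserName", "Password",
                    "Port", "Encrypt", "OrganizationName", "Query")
VALID_ACTIONS = {"Create", "Update", "Both"}
VALID_DRIVERS = {"mssql", "mysql", "mysql320", "csv", "excel"}

def check_conf(conf):
    """Return a list of human-readable problems; an empty list means OK."""
    problems = []
    for key in REQUIRED_TOP:
        if key not in conf:
            problems.append("missing top-level key: %s" % key)
    if conf.get("OrganisationAction") not in VALID_ACTIONS:
        problems.append("OrganisationAction must be Create, Update or Both")
    sql = conf.get("SQLConf", {})
    for key in REQUIRED_SQLCONF:
        if key not in sql:
            problems.append("missing SQLConf key: %s" % key)
    if sql.get("Driver") not in VALID_DRIVERS:
        problems.append("Driver must be one of: %s"
                        % ", ".join(sorted(VALID_DRIVERS)))
    if not conf.get("OrganizationMapping", {}).get("organization_name"):
        problems.append("OrganizationMapping must map organization_name")
    return problems

# Example: an obviously incomplete config reports its problems
print("\n".join(check_conf({"OrganisationAction": "Both"})))
```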
=Set up ODBC Connector for CSV files=

To create a 64-bit ODBC connector for CSV files, you need to have the '''Microsoft Access Text Driver''' installed. This comes with the Microsoft Access Database Engine, which can be found as an [https://www.microsoft.com/en-us/download/details.aspx?id=13255 MS 2010 Redistributable] or as an optional part of the Microsoft Office suite.

==Step 1==

* In the Windows menu, open the '''ODBC Data Sources''' app as Administrator (in Windows 10 you can type "ODBC" in the search box on the taskbar to quickly find the ODBC Data Sources app)
* In the ODBC Data Sources app, navigate to the '''System DSN''' tab
* Click the '''Add''' button to "Create a New Data Source". In the list of drivers, locate '''Microsoft Access Text Driver (*.txt, *.csv)'''. If this driver is not present in the list then you don't have the Microsoft Access Database Engine installed. Please refer to the notes above to find out how to install it
* Click the '''Finish''' button to progress to the next step
==Step 2==

* Type in a name for your Data Source - this name will be used for the "Database" parameter in the JSON config file
* Set up the directory/folder where the CSV files are located:
** When configuring the connector for the first time, the '''Use Current Directory''' checkbox is selected by default. Unselect this checkbox to enable the '''Select Directory''' button and click it. Navigate to the directory/folder where the CSV files are located, select the folder, then click the '''OK''' button.
** When editing an existing CSV connector, the '''Use Current Directory''' checkbox should be unselected and '''Select Directory''' enabled. If the folder containing the CSV files is already displayed then there is no need to do anything in this step.
* Click the '''Options''' button to expand the current window
* In the "Extensions List", select the '''*.csv''' option
* Click the '''Define Format''' button to progress to the next step
  
==Step 3==

* If the directory/folder has been configured correctly in the previous step, you should see the CSV file(s) in the '''Tables''' list. Select the file you need to use for the import.
* If you have column headings in your CSV file, activate the '''Column Name Header''' checkbox
* Amend the '''Rows to Scan''' option depending on how many records you are importing. By default this is set to 25; if you are importing more then it needs to be increased. It does not have to be an exact number, but it needs to be sufficient to ensure all records from your file are retrieved
* Set the '''Characters''' option to ANSI
* On the left-hand side there is a '''Columns''' list. By default, the columns in the ODBC data source we are configuring are named FIELD1, FIELD2, etc. You can leave them as they are or customise them, but you need to ensure the columns in this list match the columns in the CSV file. This does not affect the CSV file in any form; it applies only to the ODBC Data Source configuration.
**Note: when the CSV file has column headers, clicking the '''Guess''' button will attempt to read the column headers and customise the columns in the list accordingly. This is not a failproof action (hence the name of the button), therefore you need to ensure the columns are configured correctly if the Guess button is used:
*** '''Data Type''' = should always be '''Char''' unless there is a need for another type
*** '''Name''' = should be a valid name. Please note that on occasion invalid characters are introduced in the column name. You need to remove these characters and then click the '''Modify''' button to save the changes. Invalid characters in a column name can cause the tool to fail to import the records.
* Click the '''OK''' buttons all the way back to the '''System DSN''' tab to finish the configuration
  
Note: when importing from multiple files, repeat this step for all the files you need to import.

If the configuration is complete, there should now be a file named '''schema.ini''' in the directory/folder selected during configuration. This file stores the configuration performed in the above steps; if it is missing then the configuration was not completed successfully. This file stores the configuration for every file in the respective directory/folder, therefore when defining a format for another file the configuration is appended to this file rather than creating a separate schema file.
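For reference, after completing the steps above for a file named organisations.csv, the generated schema.ini might contain something like the following. This is an illustrative sketch; the exact contents are written by the ODBC dialog, and the file name, scan rows and column list will reflect your own configuration:

<code>
[organisations.csv]
ColNameHeader=True
Format=CSVDelimited
MaxScanRows=200
CharacterSet=ANSI
Col1=FIELD1 Char
Col2=FIELD2 Char
Col3=FIELD3 Char
</code>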
  
 
  
== Execute ==

Command Line Parameters:

* '''file''' - this should point to your JSON configuration file; by default the tool looks for a file called conf.json in the current working directory. If this is present you don't need to provide the parameter.
* '''dryrun''' - if set to true, the XMLMC calls to create and update organisations will not be made; instead the generated XML will be dumped to the log file, so you can ensure the data mappings are correct before running the import. Defaults to false.

'goOrganisationImport_x64.exe -file=conf.json'

== Preparing to run the tool ==

* Open '''conf.json''' and add in the necessary configuration;
* Open a Command Line Prompt as Administrator;
* Change directory to the folder containing the goOrganisationImport executables, e.g. 'C:\organisation_import\', and run the appropriate executable:
** On 32-bit Windows PCs: goOrganisationImport_x86.exe
** On 64-bit Windows PCs: goOrganisationImport_x64.exe
* Follow all on-screen prompts, taking careful note of all prompts and messages provided.

= API Key Rules =

This utility uses [[API keys]]. The key provided should have rights to call the following APIs:

* data:entityAddRecord
* data:entityBrowseRecords2
* data:entityUpdateRecord
* system:logMessage

=Troubleshooting=

==Logging Overview==

All logging output is saved in the 'log' directory, in the same directory as the executable. The log file name contains the date and time the import was run: 'SQL_Organization_Import_YYYYMMDDHHMMSS.log'

==Common Error Messages==

Below are some common errors that you may encounter in the log file, and what they mean:

* '' '''[ERROR] Error Decoding Configuration File:.....''' '' - this is typically due to a missing quote (") or comma (,) somewhere in the configuration file. This is where an online JSON viewer/validator can come in handy, rather than trawling the conf file looking for that proverbial needle in a haystack.

=== Error Codes ===

* '''100''' - Unable to create log file
* '''101''' - Unable to create log folder
* '''102''' - Unable to load configuration file

= HTTP Proxies =

{{UtilityProxies}}

=Scheduling Overview=

==Windows==

You can schedule goHornbillOrgImport.exe to run with any optional command line argument from Windows Task Scheduler.

* Ensure the user account running the task has rights to goHornbillOrgImport.exe and the containing folder.
* Make sure the Start In parameter contains the folder where goHornbillOrgImport.exe resides, otherwise it will not be able to pick up the correct path.
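As an illustration, a scheduled task could also be created from an elevated command prompt using the built-in schtasks utility. The task name, paths and schedule below are example values only; adjust them to your environment:

<code>
schtasks /Create /TN "Hornbill SQL Organisation Import" /TR "C:\organisation_import\goHornbillOrgImport.exe -file=C:\organisation_import\conf.json" /SC DAILY /ST 02:00
</code>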
 
  
[[Category:Administration]]
[[Category:Integration]]

Latest revision as of 15:15, 24 September 2021

About the SQL Organisation Import Utility

The utility provides a quick and easy method of uploading and updating organisations from an external data source into Hornbill instance.

Open Source

The Hornbill SQL Organisation Import Utility is provided open source under the Hornbill Community Licence and can be found Here on GitHub

Installation Overview

Windows Installation

  • Download the ZIP archive relevant to your OS and architecture. This contains the tool executable file, configuration file and license;
  • Extract the ZIP archive into a folder you would like the application to run from e.g. 'C:\organisation_import\'.


Configuration Overview

The configuration of this utility is managed through a JSON file (conf.json), which is supplied with each release:

   "APIKey": "your_api_key_here",
   "InstanceId": "your_instance_id_here",
   "OrganisationAction": "Create/Update/Both",
   "SQLConf": {
       "Driver": "",
       "Server": "",
       "Database": "",
       "UserName": "",
       "Password": "",
       "Port": 0,
       "Encrypt": false,
       "OrganizationName": "FIELD 1",
       "Query": "SELECT <field_list> FROM <table_or_file>"
   },
   "OrganizationMapping":{
       "organization_name":"FIELD 1",
       "address":"FIELD 2",
       "city":"FIELD 3",
       "state":"FIELD 4",
       "postal_code":"FIELD 5",
       "country":"FIELD 6",
       "industry":"FIELD 7",
       "phone_number":"FIELD 8",
       "website":"FIELD 9",
       "language":"FIELD 10",
       "custom_1":"FIELD 11",
       "custom_2":"FIELD 12",
       "custom_3":"FIELD 13",
       "custom_4":"FIELD 14",
       "custom_5":"FIELD 15",
       "custom_6":"FIELD 16",
       "custom_7":"FIELD 17",
       "custom_8":"FIELD 18",
       "custom_9":"FIELD 19",
       "custom_10":"FIELD 20",
       "custom_11":"FIELD 21"
   }


  • APIKey - a Hornbill API key for a user account with the correct permissions to carry out all of the required API calls
  • InstanceId - Hornbil instance ID. You can get the instance ID from your URL used to access Hornbill: https://live.hornbill.com/your_instance_id/. This value is case sensitive.
  • OrganisationAction - one of the following values:
    • Create = Import will only create new organisations in your instance.
      • If the imported organisation already exists and this is set to "Create" then the organisation will not be created
    • Update = Import will only update existing organisations in your instance.
      • If the imported organisation does not exist and this is set to "Update" then the organisation will not be updated
    • Both = Import will create new organisations and update existing organisations in your instance.
      • If the imported organisation does not exist and this is set to "Both" then the organisation will be created
      • If the imported organisation does exist and this is set to "Both" then the organisation will be updated


SQLConf

  • Driver - the driver to use to connect to the database that holds the asset information. Needs to be one of the following:
    • mssql = Microsoft SQL Server (2005 or above)
    • mysql = MySQL Server 4.1+, MariaDB
    • mysql320 = MySQL Server v3.2.0 to v4.0
    • csv = ODBC Data Source using MS Access Text Driver (*.txt, *.csv)
    • excel = ODBC Data Source using MS Excel Driver (*.xls, *.xlsx, *.xlsm, *.xlsb)


When using mssql, mysql or mysql320 drivers, the Server, Database, UserName, Password and Port parameters should be populated accordingly:

  • Server = the address of the SQL server (e.g. localhost)
  • Database = the name of the Database to connect to
  • UserName = this should be the SQL authentication Username to connect to the Database (if any, e.g. root)
  • Password = this should be the password for the above username (if any)
  • Port = the SQL port (e.g. 5002)


When using csv or excel as a data source, the Database parameter should be populated accordingly:

  • Database = this should be populated with the name of the ODBC connection (the name of the Data Source - DSN) on the PC that is running the tool


In addition, when using any of the available drivers, the Encrypt, OrganizationName and Query parameters should also be populated accordingly:

  • Query = this should be the SQL query to retrieve the organisation records (e.g. SELECT * FROM <datbase_table_or_file>)
  • OrganizationName = specify which field from your data source contains the organisation name
  • Encrypt = boolean value to specify whether the connection between the script and the database should be encrypted.

NOTE: There is a bug in SQL Server 2008 and below that causes the connection to fail if the connection is encrypted. Only set this to true if your SQL Server has been patched accordingly.


OrganizationMapping

  • Maps data from your data source into the generic Hornbill organisation record
  • Any value wrapped with [] will be populated with the corresponding column from the SQL query response; any other value is treated literally, as written
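A minimal sketch of a mapping, assuming the SQL query returns columns named org_name and org_industry (the Hornbill field names shown are also illustrative and should match the organisation fields available in your instance):

```json
"OrganizationMapping": {
    "organization_name": "[org_name]",
    "industry": "[org_industry]",
    "notes": "Imported by the SQL Organisation Import utility"
}
```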

Set up ODBC Connector for CSV files

To create a 64 bit ODBC connector for CSV files, you need to have the Microsoft Access Text Driver installed. This comes with the Microsoft Access Database Engine, which is available as the Microsoft Access Database Engine 2010 Redistributable or as an optional part of the Microsoft Office suite.

Step 1

  • in the Windows menu, open the ODBC Data Sources app as Administrator (in Windows 10 you can type "ODBC" into the search box on the taskbar to quickly find the ODBC Data Sources app)
  • in the ODBC Data Sources app, navigate to the System DSN tab
  • click the Add button to "Create a New Data Source". In the list of drivers, locate Microsoft Access Text Driver (*.txt, *.csv). If this driver is not present in the list then you don't have the Microsoft Access Database Engine installed; please refer to the notes above to find out how to install it
  • click the Finish button to progress to the next step

Step 2

  • type in a name for your Data Source - this name will be used for the "Database" parameter in the JSON config file
  • set up the directory/folder where the CSV files are located:
    • when configuring the connector for the first time, the Use Current Directory checkbox is selected by default. Unselect this checkbox to enable the Select Directory button, then click it. Navigate to the directory/folder where the CSV files are located, select the folder, then click the OK button.
    • when editing an existing CSV connector, the Use Current Directory checkbox should already be unselected and Select Directory enabled. If the folder containing the CSV files is already displayed then there is nothing further to do in this step.
  • click on Options button to expand the current window
  • in the "Extensions List" select the *.csv option
  • click on Define Format button to progress to the next step

Step 3

  • if the directory/folder has been configured correctly in the previous step you should see the CSV file(s) in the Tables list. Select the file you need to use for the import.
  • if you have column headings in your CSV file activate Column Name Header checkbox
  • amend the Rows to Scan option depending on how many records you are importing. By default this is set to 25; if you are importing more records then it needs to be increased. It does not have to be an exact number, but it must be sufficient to ensure all records from your file are retrieved
  • set Characters option to ANSI
  • on the left hand side there is a Columns list. By default, the columns in the ODBC data source we are configuring are named FIELD1, FIELD2, etc. You can leave them as they are, or customise them, but you need to ensure the columns in this list match the columns in the CSV file. This does not affect the CSV file in any way; it only applies to the ODBC Data Source configuration.
    • Note: when the CSV file has column headers, clicking the Guess button will attempt to read them and name the columns in the list accordingly. This is not a failproof action (hence the name of the button), so you need to check that the columns are configured correctly if the Guess button is used:
      • Data Type = should always be Char unless another type is specifically required
      • Name = should be a valid name. Note that on occasion invalid (non-printable) characters are introduced into the column name. Remove these characters and then click the Modify button to save the change; invalid characters in a column name can cause the tool to fail to import the records.
  • click on Ok buttons all the way back to System DSN tab to finish the configuration

Note: when importing from multiple files, repeat Step 3 for each file you need to import

If the configuration is complete, the directory/folder selected during configuration should now contain a file named schema.ini. This file stores the configuration performed in the steps above; if it is missing, the configuration was not completed successfully. The schema.ini file stores the configuration for every file in the respective directory/folder, so when defining a format for another file the configuration is appended to this file rather than a separate schema file being created.
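For reference, a schema.ini produced by the steps above might look something like this, assuming a file named organisations.csv with two columns (the file and column names are illustrative):

```ini
[organisations.csv]
ColNameHeader=True
Format=CSVDelimited
MaxScanRows=25
CharacterSet=ANSI
Col1=org_name Char
Col2=org_industry Char
```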


Execute

Command Line Parameters:

  • file - points to your JSON configuration file. By default the tool looks for a file named conf.json in the current working directory; if that file is present this parameter can be omitted.
  • dryrun - if set to true, the XMLMC calls to create and update organisations will not be made; instead the XML will be written to the log file, so you can ensure the data mappings are correct before running a live import. Defaults to `false`.

'goOrganisationImport_x64.exe -file=conf.json'


Preparing to run the tool

  • Open conf.json and add in the necessary configuration;
  • Open Command Line Prompt as Administrator;
  • Change directory to the folder containing the goOrganisationImport executables, e.g. 'C:\organisation_import\', then run the executable appropriate to your system:
    • On 32 bit Windows PCs: goOrganisationImport_x86.exe
    • On 64 bit Windows PCs: goOrganisationImport_x64.exe
  • Follow all on-screen prompts, taking careful note of all prompts and messages provided.


API Key Rules

The API key used by this utility requires rules that allow access to the following APIs:

  • data:entityAddRecord
  • data:entityBrowseRecords2
  • data:entityUpdateRecord
  • system:logMessage

Troubleshooting

Logging Overview

All logging output is saved in a log directory, within the same directory as the executable. The file name contains the date and time the import was run, e.g. 'SQL_Organization_Import_YYYYMMDDHHMMSS.log'

Common Error Messages

Below are some common errors that you may encounter in the log file and what they mean:

  • [ERROR] Error Decoding Configuration File:..... - this is typically due to a missing quote (") or comma (,) somewhere in the configuration file. An online JSON viewer/validator can come in handy here, rather than trawling the conf file looking for that proverbial needle in a haystack.

Error Codes

  • 100 - Unable to create log File
  • 101 - Unable to create log folder
  • 102 - Unable to Load Configuration File


HTTP Proxies

If you use a proxy for all of your internet traffic, the HTTP_PROXY and HTTPS_PROXY environment variables need to be set. These environment variables hold the hostname or IP address of your proxy server. They are standard environment variables and, like any such variable, the specific steps used to set them depend on your operating system.

For Windows machines, they can be set from the command line using the following:
set HTTP_PROXY=HOST:PORT

set HTTPS_PROXY=HOST:PORT
Where "HOST" is the IP address or host name of your proxy server and "PORT" is the specific port number. If you require a username and password to go through the proxy, the format for the setting is as follows:
set HTTP_PROXY=username:password@HOST:PORT

set HTTPS_PROXY=username:password@HOST:PORT

URLs to White List

Occasionally, on top of setting the HTTP_PROXY variable, the following URLs need to be whitelisted to allow access out to our network:

Scheduling Overview

Windows

You can schedule goHornbillOrgImport.exe to run with any optional command line argument from Windows Task Scheduler.

  • Ensure the user account running the task has rights to goHornbillOrgImport.exe and the containing folder.
  • Make sure the Start In parameter contains the folder where goHornbillOrgImport.exe resides, otherwise it will not be able to pick up the correct path.
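As an illustrative sketch, such a task could also be created from an elevated command prompt. Because schtasks cannot set the Start In folder directly, the command below changes directory before running the tool (the task name, schedule and paths are assumptions; adjust them to your environment):

```shell
schtasks /Create /TN "HornbillOrgImport" /SC DAILY /ST 02:00 /TR "cmd /c cd /d C:\organisation_import & goHornbillOrgImport.exe -file=conf.json"
```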