Saturday, May 25, 2013

Batch synchronization between your system and Sugar

Summary
This article shows how to write and schedule a batch script that synchronizes data from your system with the data in Sugar.


The use case
You want to synchronize back-office data from SAP, Sage or any other system with your CRM, but the source system is closed software, so writing a connector for Sugar is difficult.

The solution
A scheduled batch exports the updated data from the external system to a CSV file, and a PHP script then synchronizes it with Sugar through the API.

The external system is the master of the data, as this example is a one-way synchronization.


How it works
The synchronization is based on an external key coming from the master system. If the key is found in Sugar, the record is updated; otherwise it is created.

The synchronization is performed by a PHP script that is scheduled and run on your server. It reads a CSV file and calls the Sugar API. Your Sugar instance may be on-site or in any cloud; the only requirement is that the script can reach it over HTTP or HTTPS (TCP port 80 or 443).
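For instance, the script can be scheduled with a cron entry such as the one below. This is only a sketch: the nightly schedule and the paths are assumptions to adapt to your environment.

# hypothetical crontab entry: run the synchronization every night at 2:00
0 2 * * * php /opt/sugar-sync/sync.php >> /opt/sugar-sync/cron.log 2>&1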

The CSV header (first row) must contain the technical field names used in Sugar. The external key can be in any column.

All actions are recorded in a human-readable log file whose format also lets you build an automated process to analyze the results.
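Because every log line is a comma-separated record (timestamp, action, external id, detail), a few lines of PHP are enough to summarize a run. Here is a minimal sketch; it assumes the logs.txt file produced by the script in this article.

<?php
// sketch: count the actions per type in logs.txt (CMT lines are comments)
$counts = array();
foreach (file('logs.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    $parts = explode(',', $line);
    $action = isset($parts[1]) ? $parts[1] : 'UNKNOWN';
    if ($action == 'CMT') continue;
    if (!isset($counts[$action])) $counts[$action] = 0;
    $counts[$action]++;
}
foreach ($counts as $action => $n) echo "$action: $n\n";
?>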


Understanding how Sugar handles bad data
It is important to fully understand how Sugar handles bad data.
Given a badly formatted date, a bad record id reference (e.g. the account id related to a contact), a wrong data type (a string instead of a number), or a wrong field name, Sugar silently ignores the error and sets the field to a null or empty value. It does not raise an error.
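Because the API will not reject such values, it is safer to validate each row before sending it. Below is a minimal sketch; the field names (date_entered, annual_revenue) and the expected formats are only examples to adapt to your own mapping.

<?php
// hypothetical pre-import checks: flag values that Sugar would silently blank out
function validate_row(array $row) {
    $errors = array();
    // a Sugar date field must be formatted YYYY-MM-DD
    if (!empty($row['date_entered'])
        && !preg_match('/^\d{4}-\d{2}-\d{2}$/', $row['date_entered'])) {
        $errors[] = 'bad date format: ' . $row['date_entered'];
    }
    // a numeric field must really contain a number
    if (!empty($row['annual_revenue']) && !is_numeric($row['annual_revenue'])) {
        $errors[] = 'not a number: ' . $row['annual_revenue'];
    }
    return $errors; // an empty array means the row looks safe to send
}
?>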


Understanding the script
define('VERBOSE', true);
The script writes a log file called logs.txt. Setting VERBOSE to true also writes the log to standard output.

Configuration
The instance URL:
$url = 'http://yourdomain.sugarcrm.eu/service/v2/rest.php';
The external id field name:
$extidname = 'externalid_c';

The data file name:
$import_file = './test_data.csv';


1) Log in
Log in to your instance. You must log in as an admin user; otherwise, if you use a private teams model, the interface user may not see all the records, fail to match the external ids, and create duplicate records instead of updating the existing ones.

2) Read the CSV file to get all the external ids
The external ids will be stored in the array $extid.

Make a get_entry_list call with the following parameters:
    $parameters = array(
        'session' => $sessionid,
        'module_name' => 'Accounts',
        'query' => "$extidname IN ($extidlst)",
        'order_by' => 'name',
        'offset' => 0,
        'select_fields' => array('id', $extidname),
        'link_name_to_fields_array' => array(),
        'max_results' => '1000',
        'deleted' => '0'
    );

Change the module_name to your target module.

3) Import the data
Read the data file and build an array of records. If the external id exists in Sugar, the record is updated (we reuse the Sugar id retrieved in the previous step); otherwise it is created. This is a kind of upsert (see Wikipedia), as sketched below.
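For one record, the decision boils down to the following sketch, where $externalid2sg is the external id to Sugar id map built from the get_entry_list result (see the complete source code at the end of this article).

// sketch of the upsert decision for one record
if (isset($externalid2sg[$extid])) {
    // known external id: adding the Sugar id makes set_entries update the record
    $record[] = array('name' => 'id', 'value' => $externalid2sg[$extid]);
}
// otherwise no id field is added and set_entries creates a new record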

Remark
If you run the script twice by mistake, data quality is not compromised: the second run simply updates all the records with the same data.

Log file
The log file is split into two sections.
The section called "search ext id" shows the data analysis: which records will be inserted or updated.
The script then calls the API method set_entries and fills the "import data" section with the result of the upsert.


In the example, there are two records in the CSV file. The first one was updated, the second inserted.


2013-05-24 23:55:34,CMT,###START NEW LOG###
2013-05-24 23:55:39,CMT,## search ext id
2013-05-24 23:55:39,UPDATE,abc124,b5984f13-d762-9708-8f19-519f096d38f4
2013-05-24 23:55:39,INSERT,abc125,Acme 15
2013-05-24 23:55:39,CMT,## import data
2013-05-24 23:55:40,UPDATED,abc124,b5984f13-d762-9708-8f19-519f096d38f4
2013-05-24 23:55:40,INSERTED,abc125,8b236d6a-981d-803b-c83a-519fe1746037

The data file

"externalid_c","name","phone_office","phone_fax","email1","website","billing_address_street","billing_address_postalcode","billing_address_city","billing_address_country"
"abc124","Acme 14","05 55 11 22 32","05 55 11 22 34","contact@acme14.com","www.acme14.com","41 RUE WINSTON CHURCHILL",87000,"LIMOGES","FRANCE"
"abc125","Acme 15","01 40 55 66 77","01 40 55 66 78","contact@acme15.org","www.acme15.org","6 PLACE DE LA MADELEINE",75008,"PARIS","FRANCE"


The complete source code

Download the complete code from Pastebin: http://pastebin.com/RdXYAaQn

<?php
ini_set("auto_detect_line_endings", true);

define('VERBOSE', true);

// Write a timestamped line to the log file (and to the output if VERBOSE)
function addlog($h, $s) {
    $s = date('Y-m-d H:i:s').",$s\n";
    if (VERBOSE) echo $s;
    fputs($h, $s);
}

//
// 1) LOGIN
//
$handlew = @fopen('logs.txt', "a");
if (!$handlew) die("cannot write to the log file\n");

addlog($handlew, 'CMT,###START NEW LOG###');
$url = 'http://yourdomain.sugarcrm.eu/service/v2/rest.php';
$curl = curl_init($url);
curl_setopt($curl, CURLOPT_POST, true);
curl_setopt($curl, CURLOPT_HEADER, false);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
$parameters = array(
    "user_auth" => array(
        'user_name' => "will",
        'password' => md5('will'),
    ),
    "application_name" => ''
);

$json = json_encode($parameters);
$postArgs = 'method=login&input_type=json&response_type=json&rest_data=' . $json;
curl_setopt($curl, CURLOPT_POSTFIELDS, $postArgs);
$response = curl_exec($curl);
$result = json_decode($response);
if (!is_object($result)) { die("Connection error\n"); }
if (isset($result->number)) { // invalid login
    addlog($handlew, 'ERROR,'.$result->name);
    fclose($handlew);
    exit;
}
$sessionid = $result->id;
$userid = $result->name_value_list->user_id->value;

//
// 2) read CSV - get the external Id
//
$extidname = 'externalid_c';
$import_file = './test_data.csv';
$separator = ',';
$handle = @fopen($import_file, "r");
if (!$handle) die("cannot open the import file $import_file\n");

$extid = array();
$headers1 = fgetcsv($handle, 1000, ",", '"'); // header row: Sugar field names
$headers2 = array();                          // field name => column index
for ($i = 0; $i < count($headers1); $i++) $headers2[$headers1[$i]] = $i;
while (($a = fgetcsv($handle, 1000, ",", '"')) !== false) {
    if (count($a) <= 1) continue;
    array_push($extid, "'".$a[$headers2[$extidname]]."'");
}
if (!feof($handle)) die("Error: unexpected fgetcsv() fail\n");
fclose($handle);

//
// 3) Mapping: select accounts based on the external Id
//
$externalid2sg = array(); // external id => Sugar id
$extidlst = implode(",", $extid);
if ($extidlst != "" && $extidlst != "''") {
    $parameters = array(
        'session' => $sessionid,
        'module_name' => 'Accounts',
        'query' => "$extidname IN ($extidlst)",
        'order_by' => 'name',
        'offset' => 0,
        'select_fields' => array('id', $extidname),
        'link_name_to_fields_array' => array(),
        'max_results' => '1000',
        'deleted' => '0'
    );
    $json = json_encode($parameters);
    $postArgs = 'method=get_entry_list&input_type=json&response_type=json&rest_data=' . $json;
    curl_setopt($curl, CURLOPT_POSTFIELDS, $postArgs);
    $response = curl_exec($curl);
    $result = json_decode($response);
    foreach ($result->entry_list as $i)
        $externalid2sg[$i->name_value_list->$extidname->value] = $i->id;
}

//
// 4) Import
//
addlog($handlew, 'CMT,## search ext id');
$handle = @fopen($import_file, "r");
if (!$handle) die('cannot open the import file');

$headers = array();
$records = array();
$logs = array();

while (($a = fgetcsv($handle, 1000, ",", '"')) !== false) {
    if (count($a) <= 1) continue;
    if (count($headers) == 0) {
        $headers = $a;
        continue;
    }
    $record = array();
    $extid = $a[$headers2[$extidname]];
    for ($i = 0; $i < count($a); $i++) {
        if ($headers[$i] == "NA") continue; // NA for a column you want to skip
        $field = array();
        $field['name'] = $headers[$i];
        $field['value'] = $a[$i];
        array_push($record, $field);
    }
    // case update: the external id is already in Sugar, send the Sugar id as well
    if (isset($externalid2sg[$extid])) {
        $field = array();
        $field['name'] = 'id';
        $field['value'] = $externalid2sg[$extid];
        array_push($record, $field);
        addlog($handlew, "UPDATE,$extid,".$externalid2sg[$extid]);
        array_push($logs, "UPDATED,$extid,");
    } else {
        addlog($handlew, "INSERT,$extid,".$a[$headers2['name']]);
        array_push($logs, "INSERTED,$extid,");
    }
    array_push($records, $record);
}
if (!feof($handle)) die("Error: unexpected fgetcsv() fail\n");
fclose($handle);

addlog($handlew, 'CMT,## import data');
$parameters = array(
    'session' => $sessionid,
    'module_name' => 'Accounts',
    'name_value_lists' => $records
);

$json = json_encode($parameters);
$postArgs = 'method=set_entries&input_type=json&response_type=json&rest_data=' . $json;
curl_setopt($curl, CURLOPT_POSTFIELDS, $postArgs);
$response = curl_exec($curl);
$result = json_decode($response);
$j = 0;
foreach ($result->ids as $id)
    addlog($handlew, $logs[$j++].$id);

curl_close($curl);
fclose($handlew);
?>

Tuesday, May 21, 2013

Concurrent update with Sugar

Today I would like to share with you a useful feature of Sugar that is not well known: record locking when two users update the same record.

Let's assume that Will is editing the opportunity "ABC opty 100 units" in Sugar. He wants to update the amount. At the same time, Jim is editing the same opportunity. Will is faster than Jim, finishes editing first and saves the opportunity. When Jim clicks the save button, he triggers Sugar's optimistic lock. Sugar shows a special page asking Jim whether he wants to commit his modification or not. Jim can see exactly the differences:


If the change had been made not by Will but by an interface (an API user), the result would have been exactly the same for Jim.

This feature is very useful: it prevents two users from unknowingly overwriting each other's changes to the same record.

Blog is now fixed!

Hi readers,

over the last few days this blog was awfully slow due to a bad DNS configuration on my personal web site. External JavaScript files failed to load. It is now fixed. I apologize for the inconvenience.

I will update this place soon. In the meantime, enjoy the cold spring!