Kubilay Çilkara

Database Systems is a blog about Databases, Oracle, Salesforce and Data Integration.

API Integration with Zapier (Gmail to Salesforce)

Sun, 2014-09-07 11:42
Recently I attended a training session with +General Assembly in London titled 'What and Why of APIs'. It was a training session focusing on the usage of APIs and it was not technical at all. I find this type of training session very useful, as it describes the concepts and controlling ideas behind technologies rather than the hands-on, involved implementation details.

What grabbed my attention among the many different and very useful public and private API tools, 'thingies', introduced in this training session was Zapier (www.zapier.com).

Zapier looked to me like a platform for integrating APIs with clicks rather than code, with declarative programming. It is a way of automating the internet. What you get when you sign up with them is the ability to use 'Zaps', or create your own Zaps. Zaps are integrations of endpoints, like connecting Foursquare to Facebook or Gmail to Salesforce and syncing them. One of the available Zaps does exactly that: it connects your Gmail emails to Salesforce using the Gmail and Salesforce APIs and lets you sync between them. Not only that, Zapier Zaps also put triggers on the endpoints which allow you to sync only when certain conditions are true. For example, the Gmail to Salesforce Zap can push your email into a Salesforce Lead only when an email with a certain subject arrives in your Gmail inbox. This is what the Zapier platform looks like:


An individual Zap looks like this and is nothing more than a mapping of the endpoints with some trigger actions and filters.


The environment is self-documenting and very easy to use. All you do is drag and drop Gmail fields and match them with the Lead (or other custom object) fields in Salesforce. Then you configure the sync to happen only under certain conditions/filters. It is really easy to set up. The free version runs the sync every 5 hours, which is good enough for me; the paid version runs the sync every 5 minutes.
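
Purely for illustration, here is a rough Python sketch of what such a Zap amounts to conceptually: a trigger, a filter and a field mapping. The field names and the filter keyword are made up, and this is not Zapier's actual configuration format.

    # Conceptual sketch only; not Zapier's real configuration format.
    # A Zap is essentially a trigger, an optional filter and a field mapping.
    zap = {
        "trigger": {"app": "Gmail", "event": "new_email"},
        # Filter: only act on emails whose subject contains a keyword
        "filter": lambda email: "New Lead" in email["subject"],
        "action": {
            "app": "Salesforce",
            "object": "Lead",
            # Gmail field -> Salesforce Lead field (hypothetical names)
            "field_map": {
                "from_name": "LastName",
                "from_address": "Email",
                "subject": "Description",
            },
        },
    }

    def run_zap(zap, email):
        """Apply the filter, then map the email into a Salesforce Lead dict."""
        if not zap["filter"](email):
            return None
        return {sf: email[gmail] for gmail, sf in zap["action"]["field_map"].items()}
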
There is even the capability to track historical runs and to trigger a manual run via the Zap menu. See below the 'Run' command, which runs a Zap whenever you like.

In my case I used the tool to create a Zap to do exactly what I just described: my Zap automatically creates a Salesforce Lead in my Salesforce org whenever a 'special' email is sent to me. Great automation!
This is a taste of the 'platform cloud' tools out there for doing API-to-API and app-to-app integrations with clicks and not code. With tools like Zapier, all you really need is imagination!
Categories: DBA Blogs

MySQL on-premise to Amazon RDS migration tips

Mon, 2014-07-14 13:27
Things to watch for and do when migrating MySQL databases from 'on-premise' to Amazon AWS RDS:

  • Not all database versions can be migrated to RDS, especially if you want to do a zero-downtime migration. Make sure you know which versions are possible; at the time of writing, Amazon has announced that it will support any version of MySQL 5.1 and above.
  • In a zero-downtime migration to Amazon RDS you use mysqldump or mydumper to import the baseline data, and then you use MySQL replication and the binary log position to apply the additional records created during the import (the delta). That is, it is possible to create a MySQL slave in the Amazon AWS cloud!
  • So when you have confirmed that your on-premise MySQL is compatible, you can use mysqldump with the --master-data parameter to export your data, including the binlog position coordinates at the time of the export. If your database is big, you can use mydumper to do this with parallel streams. You will then use the coordinates and MySQL replication to catch up with the on-premise master database when creating the MySQL slave in RDS (see the first sketch after this list).
  • Use different database parameter groups for different databases.
  • As you load the RDS database using myloader or mysql, the operation might take a long time depending on the size of your database. If this is the case, disable automated backups (which also stops binary logging), and try using one of the better-spec RDS instance classes and provisioned IOPS for the duration of the operation. You can always downsize the RDS instance after you have completed the initial load (see the second sketch after this list).
  • After you have completed the initial load, use Multi-AZ, which is a synchronous standby (in Oracle parlance), and schedule the backups immediately, before you open your applications to the database, as the initial backup requires a reboot.
  • Beware that there is no SSH access to RDS, which means you have no access to the file system.
  • Get the DB security groups right and make sure your applications can access the RDS instances.
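
To make the dump-and-catch-up steps above concrete, here is a minimal Python sketch wrapping the standard mysqldump and mysql client tools. All host names, credentials, file names and binlog coordinates are placeholders, and the RDS replication stored procedures (mysql.rds_set_external_master, mysql.rds_start_replication) should be checked against the Amazon RDS documentation for your engine version.

    # Sketch of the zero-downtime flow: baseline dump with binlog coordinates,
    # load into RDS, then replicate the delta from the on-premise master.
    # Hosts, users, passwords, file names and coordinates are placeholders.
    import subprocess

    # 1. Dump the on-premise master; --master-data=2 writes the binlog file
    #    and position as a comment at the top of the dump.
    subprocess.run(
        ["mysqldump", "--single-transaction", "--master-data=2",
         "--all-databases", "-h", "onprem-host", "-u", "dump_user",
         "-pSECRET", "--result-file=baseline.sql"],
        check=True)

    # 2. Load the baseline into the RDS instance.
    with open("baseline.sql") as dump:
        subprocess.run(
            ["mysql", "-h", "mydb.xxxx.eu-west-1.rds.amazonaws.com",
             "-u", "rds_admin", "-pSECRET"],
            stdin=dump, check=True)

    # 3. Point RDS at the on-premise master using the coordinates recorded in
    #    step 1, then start replication (RDS exposes stored procedures for
    #    this instead of CHANGE MASTER TO).
    catch_up_sql = (
        "CALL mysql.rds_set_external_master("
        "'onprem-host', 3306, 'repl_user', 'SECRET', "
        "'mysql-bin.000042', 120, 0); "
        "CALL mysql.rds_start_replication;")
    subprocess.run(
        ["mysql", "-h", "mydb.xxxx.eu-west-1.rds.amazonaws.com",
         "-u", "rds_admin", "-pSECRET", "-e", catch_up_sql],
        check=True)

The --single-transaction flag keeps the dump consistent for InnoDB tables without locking the master for the whole export, which is why it is paired with --master-data here.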
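
In the same hedged spirit, here is a boto3 sketch of temporarily disabling backups and scaling the instance up for the initial load, then downsizing, enabling Multi-AZ and re-enabling backups afterwards. The instance identifier, instance classes, IOPS and backup window are example values only.

    # Sketch: resize the RDS instance and toggle backups around the initial load.
    # Identifier, instance classes, IOPS and backup window are example values.
    import boto3

    rds = boto3.client("rds", region_name="eu-west-1")

    # Before the load: setting backup retention to 0 disables automated backups
    # (and binary logging); move to a bigger instance class with provisioned IOPS.
    rds.modify_db_instance(
        DBInstanceIdentifier="mydb",
        BackupRetentionPeriod=0,
        DBInstanceClass="db.m3.2xlarge",
        Iops=2000,
        ApplyImmediately=True)

    # After the load: downsize, turn on Multi-AZ and re-enable backups.
    # The first backup after retention was 0 requires a reboot, so do this
    # before opening the database to your applications.
    rds.modify_db_instance(
        DBInstanceIdentifier="mydb",
        DBInstanceClass="db.m3.large",
        MultiAZ=True,
        BackupRetentionPeriod=7,
        PreferredBackupWindow="02:00-03:00",
        ApplyImmediately=True)
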
Categories: DBA Blogs