Yearly Archives: 2011

Salesforce.com as a Platform?

This post was written in 2006

In March 2006, I heard from several people in the non-profit technology consulting space that using Salesforce.com as a platform for building web database applications for non-profits was a reasonable strategy.  It has several obvious benefits:

  • free hosting for non-profits, with access for up to 10 users
  • a growing body of consultants who are familiar with customizing Salesforce
  • a growing marketplace of Salesforce add-ons
  • a consistent web interface across applications that users should find familiar

I signed up for a developer account to test it out, and I decided to reimplement one of the projects I had already deployed using ASP.NET 2.0 (Seattle SCORES).  This ASP.NET application uses SQL Server 2000 as its database and has about 13 main tables and 8 lookup tables.  The time I spent developing the ASP.NET 2.0 solution breaks down as follows:

  • 80 hours – core functionality (schema creation, pages for edit, insert, delete, view, search of all major tables in the application)
  • 22 hours – ajax based page to display and update sparse data (tricky, since it doesn’t map to a simple table you can display a grid on.)
  • 24 hours – reports
  • 16 hours – security, importing data, finishing touches
  • 9 hours – dealing with issues at CrystalTech (the application service provider that hosts it)
  • 6 hours – meeting with client
  • 157 hours total

Now the surprising result with Salesforce: I implemented the core functionality in 8 hours!  This compares to 80 hours for the ASP.NET solution.

All is not perfect with the Salesforce solution, however.  There is no way to create a page that handles sparse data.  Thus the Salesforce solution would still need a page developed in .NET that reads and writes Salesforce data.  I’ve spent about 6 hours looking into this and began porting my ajax-based page to work with Salesforce.  I believe it could be ported over in a similar amount of time.  Unfortunately, it is more complicated than doing it against SQL Server, because you have a very limited query language to use against Salesforce data.  The limitations are:

  • only simple select statements against a single table
  • no joins allowed!
  • no grouping or sorting allowed!
  • no saved queries to reuse in additional queries

For my ajax page dealing with sparse data, this means that instead of invoking a single user-defined function I created in SQL Server to get the data I needed from 3 tables, I will have to run three separate queries and perform the joins in code to get the records I actually want!  Obviously, this is more complex, more tedious, and harder to maintain.
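Joining in code is straightforward but verbose.  A minimal sketch of the idea, with hypothetical table and field names standing in for the three result sets returned by three separate single-table queries (not the actual Seattle SCORES schema):

```python
# Sketch: join three flat query results client-side, since the query
# language allows no joins. Each list stands in for the rows returned
# by one single-table query; all names here are hypothetical.

def join_results(programs, enrollments, students):
    """Inner-join three result sets in memory on their ID fields."""
    programs_by_id = {p["Id"]: p for p in programs}
    students_by_id = {s["Id"]: s for s in students}
    joined = []
    for e in enrollments:
        program = programs_by_id.get(e["ProgramId"])
        student = students_by_id.get(e["StudentId"])
        if program and student:  # drop rows with no matching parent
            joined.append({
                "program": program["Name"],
                "student": student["Name"],
                "score": e["Score"],
            })
    return joined

programs = [{"Id": "p1", "Name": "Soccer"}]
students = [{"Id": "s1", "Name": "Ana"}, {"Id": "s2", "Name": "Ben"}]
enrollments = [
    {"ProgramId": "p1", "StudentId": "s1", "Score": 7},
    {"ProgramId": "p1", "StudentId": "s2", "Score": 9},
    {"ProgramId": "p1", "StudentId": "s9", "Score": 4},  # orphan row
]

rows = join_results(programs, enrollments, students)
```

Building the lookup dictionaries first keeps this at one pass per table; the tedium comes from having to repeat this boilerplate for every page that needs related data.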

Similarly, I have found the report writer in Salesforce to be very limited.  It only allows you to report on data from a single table, or two tables if you have defined a master-child relationship between them.  It will only be adequate for the simplest of reports.  To work around this problem, I see two potential solutions:

  1. Write a .NET application, or VBA project, to copy all the data to a SQL Server or Access database, and then develop your reports in Access or with the Report Writer in ASP.NET.
  2. Write a bunch of code for each report to extract the data the report needs into memory in a dataset, and then use the Report Writer in ASP.NET.
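For solution #2, the extraction step amounts to pulling the raw rows into memory and computing the grouped summary yourself, since the query language offers no grouping.  A minimal sketch, with hypothetical donation rows and field names:

```python
# Sketch of workaround #2: fetch raw rows one table at a time, then
# compute the grouped summary in memory before handing it to a report.
# The donation rows and field names are hypothetical stand-ins.
from collections import defaultdict

def summarize_by_program(donations):
    """Group flat donation rows by program and total the amounts."""
    totals = defaultdict(float)
    for row in donations:
        totals[row["Program"]] += row["Amount"]
    return dict(totals)

donations = [
    {"Program": "Soccer", "Amount": 250.0},
    {"Program": "Tutoring", "Amount": 100.0},
    {"Program": "Soccer", "Amount": 50.0},
]

summary = summarize_by_program(donations)
# summary == {"Soccer": 300.0, "Tutoring": 100.0}
```

Every report needs its own version of this extract-and-aggregate code, which is why the linked-table products described next are attractive: they let the database engine do the grouping instead.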

In my research, I did come across several companies who sell integration software that would make this much easier.  One sells a program that keeps your Salesforce data accessible through linked tables in SQL Server: DBAmp, from forceAmp.com.  This would allow you to implement solution #1 without writing any code to copy the data.  It costs $695 for a one-year subscription.  Their FAQ says the app will continue to work even if you don’t update it, assuming Salesforce doesn’t change their API.

The other interesting software I saw and tried was an ODBC driver for Salesforce data from Open Access Software.  This ODBC driver allowed me to create linked tables to the Salesforce data in Access, easily use the Access report designer, and gain the benefit of the Jet query processing engine.  It also enabled me to use Access to easily map and import data from an existing solution into Salesforce.  The driver comes with a 30-day trial; after it expires, it is subscription based, costing $99 a year per client machine, or $999 a year to install on a server machine.  Too bad there isn’t a free driver like this, because it definitely makes extending the Salesforce solution with custom reports or pages much, much easier!


For straightforward web-based database applications, Salesforce is a very compelling choice.  It is free, and it is very quick to create a reasonable solution.  I’ve always said that I am 10 times more productive in Microsoft Access than in creating a similar ASP.NET solution.  I would say the same thing for Salesforce, and the hours recorded in my experiment above support that.

There are, however, some non-trivial issues around complex pages and data reporting that more complex applications will run into, and these need to be better understood.  I’ll need to research this more before I can commit to making Salesforce my preferred platform for building web-based database applications.  But it certainly seems promising!  Given the availability of the ODBC driver, if the non-profit is willing and able to pay the subscription cost, I would feel very comfortable creating a solution with Salesforce.

Technologies before Salesforce

I use a variety of technologies to produce the appropriate solution depending on the non-profit’s needs, requirements, and infrastructure.  I strive to create solutions with the lowest maintenance cost for the non-profit: easily administered by the non-profit’s staff, and built with the smallest amount of custom code, so that other software consultants can easily comprehend and extend the solution if needed.

Since 2009, I’ve focused exclusively on using Salesforce.com for building solutions.  Prior to that, I used a mixture of Microsoft platform technologies described below.

Frontend Choices

I typically build my solutions with either a Microsoft Access frontend database or a Microsoft ASP.NET web application.  Both work against Access or SQL Server databases.  The advantage of Access is that it is a very productive tool, allowing the creation of complex solutions in a reasonable timeframe.  It has a very powerful reporting tool, which is frequently needed to create the types of aggregate summary reporting that many non-profits need for their funders and themselves.  Also, there are many consultants who know Access, so it is easy to support an Access solution long term.

On the other hand, if the solution needs to be available to the public on the internet, then an Access frontend won’t work.  In this case, I will typically create a Microsoft ASP.NET solution, written with a mixture of Visual Basic on the server and JavaScript on the client.  I use AJAX (Asynchronous JavaScript and XML) techniques in key areas where roundtrip performance would otherwise be too slow.

Backend Choices

I typically build my solutions with either a Microsoft Access database (i.e., a Jet database) or a SQL Server database.  As always, there are pros and cons to each.  Access is easy to deploy, requires no server software to be licensed, works well in small workgroups, and works well with an Access frontend solution.  On the other hand, SQL Server works better in some scenarios: it can handle many concurrent users, is very robust against corruption issues (such as network packets being dropped), and is the recommended solution when the database will be hosted at an ISP.

Remote Support

When deciding on the appropriate technology for the solution, we must also consider requirements for remote access to the application.  If it must be on the internet and exposed to the public, ASP.NET is used.  If it must be accessible to multiple offices, but not the public, then a solution I have used successfully is to have staff run the application over Terminal Server (called Remote Desktop Connection in Windows XP).  This means the application still runs locally at their main office, and only screen images travel over the internet to the remote computer.

Mix and Match

The final solution is then some combination of these options.  Combinations I have used include:

  • Access frontend with a Jet backend (with or without usage of Terminal Server to support remote users).
  • Access frontend going through linked tables and SQL pass through queries to SQL Server (with or without Terminal Server to support remote users).
  • Access database project frontend going directly to SQL Server (Jet not involved).
  • ASP.NET web frontend with a Jet backend.
  • ASP.NET web frontend with a SQL Server backend.