I was recently honored as a finalist for the Microsoft Alumni Foundation’s 2012 Integral Fellows award. From the MSAF’s website:
The Integral Fellows award recognizes and supports a Microsoft alumnus who has made a meaningful difference in the daily lives of others by using his/her talents, time and resources to contribute to the world, whether on a local, regional, national, or global scale.
Well, I didn’t win, but I’m still quite honored to have been a finalist among this group of outstanding individuals who are doing so many great things here and around the world!
Having developed exclusively on Salesforce.com for the last 3 years, I can say it is my preferred platform! As a solution for non-profits, I believe it has many great benefits:
- 10 free enterprise licenses (worth over $15,000 a year)
- Remotely managed by Salesforce, so your data is safe, secure, and backed up.
- A system that can grow to handle many different needs in the non-profit, including donor management, volunteer management, email lists, client lists, service tracking, etc.
- A 3rd party marketplace, AppExchange, that provides additional applications such as bulk emailing tools, event management, auction management, and website integration, many of which are offered at a discount to non-profits.
The features I love most as an application developer include:
- I don’t have to install and maintain software on every staff member’s computer at the non-profit. All that is needed is an internet connection and a browser.
- I don’t have to spend lots of time creating forms and pages for each new piece of functionality I add to the system – all the pages are consistently provided by Salesforce.
- Common subsystems and applications I create can be packaged and shared between non-profits.
- Salesforce continues to add new features and evolve the system, while always maintaining strong support for older applications.
I still fret that the reporting tool built into Salesforce is not as powerful as Microsoft Access. But I am hopeful that it will continue to improve, and I know there are 3rd party tools that make reporting easier. Luckily, the reporting limitations have not been an issue for any of the solutions I’ve created with Salesforce so far.
I was quite surprised to recently see that Microsoft is closing down all web sites running on Office Live in April 2012. Current Office Live users are going to be offered 6 months free on Microsoft’s Office 365 service, and then will need to pay. OK, I can accept that you can’t expect free forever. But here’s the real surprise: you must manually recreate your website, as well as manually migrate your contacts, email, calendar, and documents. No thanks!
If I have to recreate my website, I’m going to find a better CMS than Microsoft provides! I’ve been frustrated multiple times with Office Live losing all my content when changing themes! Never again!
So now I’m trying out WordPress, and so far, I’m happy!
This post was written in 2006
In March 2006, I heard from several people in the non-profit technology consulting space that using Salesforce.com as a platform for building web database applications for non-profits was a reasonable strategy. It has several obvious benefits:
- site hosted for free to non-profits, with access for up to 10 users
- a growing body of consultants who are familiar with customizing salesforce
- a growing marketplace of add-ons to salesforce
- consistent web interface for all applications that users should feel comfortable with
I signed up for a developer account to test it out, and decided to implement one of the projects I had already deployed using ASP.NET 2.0 (Seattle SCORES). The ASP.NET application uses SQL Server 2000 as its database, and it has about 13 main tables and 8 lookup tables. Looking at the time I spent to develop the ASP.NET 2.0 solution, it breaks down as follows:
- 80 hours – core functionality (schema creation, pages for edit, insert, delete, view, search of all major tables in the application)
- 22 hours – ajax based page to display and update sparse data (tricky, since the data doesn’t map to a simple table you can display in a grid)
- 24 hours – reports
- 16 hours – security, importing data, finishing touches
- 9 hours – dealing with issues at CrystalTech (the application service provider that hosts it)
- 6 hours – meeting with client
- 157 hours total
Now for the surprising data about Salesforce: I implemented the core functionality in 8 hours! Compare that to the 80 hours for the ASP.NET solution.
All is not perfect with the Salesforce solution, however. There is no way to create a page that deals with sparse data, so the Salesforce solution would still need a page developed in .NET that reads and writes Salesforce data. I’ve spent about 6 hours looking into this and have begun porting my ajax based page to work against Salesforce; I believe it could be ported in a similar amount of time. Unfortunately, it is more complicated than doing it against SQL Server, because you have a very limited query language to use against Salesforce data. The limitations are:
- only simple select statements against a single table
- no joins allowed!
- no grouping or sorting allowed!
- no saved queries to reuse in additional queries.
For my ajax page dealing with sparse data, this means that instead of invoking the single user-defined function I created in SQL Server to get the data I needed from 3 tables, I have to run three separate queries and perform the joins in code to get the records I actually want! Obviously, this is more complex, more tedious, and harder to maintain.
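To make the "joins in code" point concrete, here is a minimal sketch of the pattern in Python. The table and field names are invented for illustration (my actual schema differs); the point is that the work a single SQL join or user-defined function would have done now has to happen in application code.

```python
# Results of three separate single-table queries, since the query
# language allows no joins. (Hypothetical tables and fields.)
students = [
    {"Id": "S1", "Name": "Ana"},
    {"Id": "S2", "Name": "Ben"},
]
enrollments = [
    {"StudentId": "S1", "SessionId": "X1"},
    {"StudentId": "S2", "SessionId": "X2"},
]
sessions = [
    {"Id": "X1", "Topic": "Poetry"},
    {"Id": "X2", "Topic": "Soccer"},
]

# Index two of the result sets by primary key, then walk the linking
# table -- the join logic a database would normally perform for us.
students_by_id = {s["Id"]: s for s in students}
sessions_by_id = {s["Id"]: s for s in sessions}

rows = [
    {"Name": students_by_id[e["StudentId"]]["Name"],
     "Topic": sessions_by_id[e["SessionId"]]["Topic"]}
    for e in enrollments
]
```

Every page or report that needs related data has to repeat some variation of this indexing-and-walking code, which is where the extra complexity and maintenance burden comes from.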
Similarly, I have found the report writer in Salesforce to be very limited. It only lets you report on data from a single table, or from two tables if you have defined a master-detail relationship between them. It will only be adequate for the simplest of reports. To work around this problem, I see two potential solutions:
- Write a .NET application or VBA project to copy all the data to a SQL Server or Access database, then develop your reports in Access or with the Report Writer in ASP.NET.
- Write a chunk of code for each report to extract the data the report needs into an in-memory dataset, and then use the Report Writer in ASP.NET.
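Since the query language also lacks grouping, workaround #2 means computing any aggregation yourself before handing rows to the report writer. A minimal sketch of that step, with invented field names, assuming the raw rows have already been fetched with a simple single-table query:

```python
from collections import defaultdict

# Raw rows from one simple query; the grouping/summing that a
# "GROUP BY" would normally do must happen in code. (Hypothetical data.)
donations = [
    {"DonorId": "D1", "Amount": 100.0},
    {"DonorId": "D2", "Amount": 40.0},
    {"DonorId": "D1", "Amount": 60.0},
]

totals = defaultdict(float)
for d in donations:
    totals[d["DonorId"]] += d["Amount"]

# Sorted summary rows, ready to feed to a report writer.
report_rows = sorted(totals.items())
```

This is easy enough for one report, but the per-report boilerplate adds up quickly compared to writing a grouped query once.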
In my research, I came across several companies that sell integration software which would make this much easier. One sells a program that keeps your Salesforce data accessible through linked tables in SQL Server: DBAmp, from forceAmp.com. This would let you implement solution #1 without writing any code to copy the data. It costs $695 for a one-year subscription. Their FAQ says the app will continue to work even if you don’t update it, assuming Salesforce doesn’t update their API.
The other interesting software I saw and tried was an ODBC driver for Salesforce data from Open Access Software. This driver allowed me to create linked tables to the Salesforce data in Access, easily use the Access report designer, and gain the benefit of the Jet query engine. It also let me use Access to easily map and import data from an existing solution into Salesforce. The driver comes with a 30-day trial; after it expires, it is subscription based, costing $99 a year per client machine or $999 a year to install on a server machine. Too bad there isn’t a free driver like this, because it definitely makes extending the Salesforce solution with custom reports or pages much, much easier!
For straightforward web-based database applications, Salesforce is a very compelling choice. It is free, and it is very quick to create a reasonable solution. I’ve always said that I am 10 times more productive in Microsoft Access than in creating a similar ASP.NET solution; I would say the same of Salesforce, and the hours recorded in my experiment above support that.
There are, however, some non-trivial issues around complex pages and data reporting that more complex applications will run into, and these need to be better understood. I’ll need to research this more before I can commit to making Salesforce my preferred platform for building web-based database applications. But it certainly seems promising! Given the availability of the ODBC driver, if the non-profit is willing and able to pay the subscription cost, I would feel very comfortable creating a solution with Salesforce.
I use a variety of technologies to produce the appropriate solution, depending on the non-profit’s needs, requirements, and infrastructure. I strive to create the solution with the lowest maintenance cost for the non-profit: one that is easily administered by the non-profit’s staff, and that uses the smallest amount of custom code, so that other software consultants can easily comprehend and extend it if needed.
Since 2009, I’ve focused exclusively on using Salesforce.com for building solutions. Prior to that, I used a mixture of Microsoft platform technologies described below.
Frontend Choices
I typically build my solutions with either a Microsoft Access frontend database or a Microsoft ASP.NET web application. Both work against Access or SQL Server databases. The advantage of Access is that it is a very productive tool, allowing the creation of complex solutions in a reasonable timeframe. It has a very powerful reporting tool, which is frequently needed to create the kinds of aggregate summary reports that many non-profits must produce for their funders and themselves. Also, many consultants know Access, so an Access solution is easy to support long term.
Backend Choices
For the backend, I use either a Microsoft Access database (i.e., a Jet database) or a SQL Server database. As always, there are pros and cons to each. Access is easy to deploy, requires no server software to be licensed, works well in small workgroups, and works well with an Access frontend solution. On the other hand, SQL Server works better in some scenarios: it can handle many concurrent users, is very robust against corruption issues (such as network packets being dropped), and is the recommended choice when the database will be hosted at an ISP.
When deciding on the appropriate technology for the solution, we must also consider requirements for remote access to the application. If it must be on the internet and exposed to the public, I use ASP.NET. If it must be accessible to multiple offices, but not the public, then a solution I have used successfully is to have staff run the application over Terminal Server (called Remote Desktop Connection in Windows XP). The application still runs locally at the main office; only screen images travel over the internet to the remote computer.
Mix and Match
The final solution is then some combination of these options. Combinations I have used include:
- Access frontend with a Jet backend (with or without usage of Terminal Server to support remote users).
- Access frontend going through linked tables and SQL pass through queries to SQL Server (with or without Terminal Server to support remote users).
- Access database project frontend going directly to SQL Server (Jet not involved).
- ASP.NET web frontend with a Jet backend.
- ASP.NET web frontend with a SQL Server backend.