Enterprise Open Source Alternatives

11 January 2011

SOS Open Source got started by talking about why open source selection matters, then offering tips on finding open source software and suggestions on how to evaluate it.

Later we shared excerpts from evaluation reports, ranging from software quality tools and wiki platforms to project management alternatives and open source forks.

The goal of this article is to share ways of finding open source candidates and to show how to compute evaluation metrics.

Choosing the right open source alternative.

We have already discussed why open source selection is important, but the effort needed to choose the right open source program for your needs can vary a lot. Consider the following examples:

  • Linux: you don’t need to be a rocket scientist to know that it is a mature and stable project led by a community, very popular and widely deployed, with many vendors providing commercial support, training and other value-added services. Choosing the ‘right’ Linux distribution for your company still requires some homework, though.
  • Experts like Facebook’s gifted open source developers can start new open source projects or just join existing ones, but achieving success simply by following in their footsteps may be tricky. In fact, among the projects they use you find codemod, which was dismissed 16 months after its first release; FlashCache, which is backed only by Facebook; and the distributed memory object caching system Memcached, a stable and maintained community-led project.

Unless you are going for the obvious names or your roots run deep in the open source world – whether you are a user interested in adopting open source (borrow or buy) or a maker willing to build software with it – you had better take your time to ponder before selecting, and to try before buying or implementing. All your typical needs, including the desired level of support, stack certifications, etc., must be carefully assessed and taken into account at evaluation time.

How to build your own “My Open Source Favorite List”.

Digging through your favorite open source directories and repositories looking for candidates is a good start.

Meta-forges, open source news sites (e.g.: Freshmeat), online book stores (e.g.: Packt, O’Reilly or Amazon) and tools like Antelink – useful to calculate the reuse of open source projects – can help you a lot in making up your mind about a project.

How We Do It.

SOS Open Source’s first goal was actually to identify metrics that could be used to evaluate open source projects quickly and reliably. We went through all the existing qualification and selection methodologies – including OSMM by Navica, OSMM by Cap Gemini, BRR and QSOS.

QSOS’s metrics turned out to be the most comprehensive; we selected the most objective of them, defining marks and grades where needed.

For each metric, SOS Open Source provides the assessor with marks and references, so that assigning grades is easy and well documented. Customers, once trained to use the SOS Open Source methodology and tools, can assess a project in half an hour on average. Drawing up a shortlist of 6-8 candidates, then evaluating and comparing them, takes about two days.

The full list of metrics is detailed below.

Metric name: Code maturity [< 1 year, 1-3 years, > 3 years]
Origin: QSOS (Age), but with a different (extendible) time frame.
How to compute: browsing forges/meta-forges (please note that sometimes projects are not released as open source from the very first day; moreover, source code may be moved from one forge to another).
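
Once the first release date is known, bucketing the age is mechanical. A minimal sketch (the function name and the use of 365.25 days per year are my own, not part of the methodology):

```python
from datetime import date

# Bucket a project's age into the three maturity grades above,
# given the date of its first open source release.
def maturity_grade(first_release: date, today: date) -> str:
    years = (today - first_release).days / 365.25
    if years < 1:
        return "< 1 year"
    if years <= 3:
        return "1-3 years"
    return "> 3 years"

# Example: a project first released in mid-2008, assessed in early 2011.
print(maturity_grade(date(2008, 6, 1), date(2011, 1, 11)))  # -> "> 3 years"
```

Remember the caveat above: the first public release may postdate the start of development, so the forge's creation date alone can mislead.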

Metric name: Code stability [unstable, stable but old, stable and maintained];
Origin: QSOS (Stability).
How to compute: browsing forges/meta-forges and bug-tracking systems.

Metric name: Project popularity [unknown, small popularity but growing trend, well known];
Origin: QSOS (Popularity), but considering also the trend.
How to compute: using Social Media search tools.

Metric name: Case Study Availability [unknown, case studies available only on the website, case studies available on the net];
Origin: QSOS (Reference), but instead of focusing on critical/non critical usage we emphasized the importance of case studies and where they are found.
How to compute: using search engines on specific sites and on the net.

Metric name: Books availability [none, few, many];
Origin: QSOS (Books), but using a relative scale.
How to compute: browsing online bookstores and searching on project’s websites.

Metric name: Community management style [benevolent dictator, company-led, community-led]
Origin: QSOS (Management Style), adding company-led, and merging “Complete dictatorship” and “Benevolent Dictator”.
How to compute: Browsing project’s websites (e.g.: looking for Developers’ zone, Community Section, etc).

Metric name: Team size [1-5 members, 5-10 members, > 10 members].
Origin: QSOS (Leading team), but using different dimensional clusters.
How to compute: Analyzing commits.
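
Counting distinct commit authors is one concrete way to do this. A sketch assuming a local git checkout of the project (function names and the one-year window are illustrative, not prescribed by SOS Open Source):

```python
import subprocess
from collections import Counter

# Distinct commit author emails over the last year approximate the
# size of the active team.
def active_authors(repo_path: str) -> Counter:
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--since=1.year", "--format=%ae"],
        capture_output=True, text=True, check=True,
    ).stdout
    return Counter(line.strip() for line in out.splitlines() if line.strip())

# Map the author count onto the three clusters above.
def team_grade(n_authors: int) -> str:
    if n_authors <= 5:
        return "1-5 members"
    if n_authors <= 10:
        return "5-10 members"
    return "> 10 members"
```

Note that author emails are a noisy proxy (one person may commit under several addresses), so eyeballing the `Counter` before grading is worthwhile.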

Metric name: Commercial support [n/a, available only in a geographic area/lang, available from multiple vendors in different languages];
Origin: QSOS (Support), but we consider only commercial support services with a clear SLA, and we look also at geographic availability.
How to compute: Browsing project’s and vendors’ websites (e.g.: looking for Support/Services/Consulting section) and searching on the net.

Metric name: Training [n/a, available only in a geographic area/lang, available from multiple vendors in different languages];
Origin: QSOS (Training).
How to compute: Browsing project’s and vendors’ websites (e.g. looking for Training/Certification sections) and searching on the net.

Metric name: Documentation [n/a, available only in one language, available in many languages];
Origin: QSOS (Documentation); the quantity and variety of available documentation are considered by our methodology.
How to compute: Browsing project’s and vendors’ websites (e.g. looking for Documentation/Resources sections) and searching on the net.

Metric name: QA process [n/a, existing but not supported by tools, supported by tools];
Origin: QSOS (Quality Assurance), but using a different scale.
How to compute: Browsing project’s websites.

Metric name: QA tools [n/a, existing but not much used, very active use of tools];
Origin: QSOS (Quality Assurance), but Automatic Testing processes are not assessed (rare, and difficult to verify).
How to compute: Browsing project’s websites (e.g.: looking into developers’ zone/how-to-contribute sections).

Metric name: Bugs reactivity [poor, formalized but not reactive, formalized and reactive];
Origin: QSOS (Quality Assurance).
How to compute: Browsing project’s websites (e.g. bug-tracking systems/forums).
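
Where the bug tracker allows exporting issue data, the median resolution time gives a rough quantitative signal to back up the browsing. A sketch with illustrative names (issues are (opened, closed) timestamp pairs; still-open bugs carry closed=None and are excluded):

```python
from datetime import datetime
from statistics import median

# Median number of days between a bug being opened and being closed.
# Returns None when no bug has been closed yet.
def median_days_to_close(issues):
    days = [(closed - opened).days for opened, closed in issues if closed]
    return median(days) if days else None
```

A low median over a reasonable sample suggests "formalized and reactive"; a tracker that exists but shows long-stale bugs points to "formalized but not reactive".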

Metric name: Source [to be compiled, binaries available, virtual appliance available];
Origin: QSOS (Source), but focused on the ease of installation (source availability is always a pre-condition).
How to compute: Browsing download’s pages.

Metric name: Red Hat/Solaris/Windows [n/a, supported by third parties, certified by Red Hat/Oracle/Microsoft];
Origin: QSOS (Packaging), but giving the highest mark to projects certified by vendors.
How to compute: Browsing download’s pages and vendors’ websites.

Metric name: Amount of comments [none, poorly commented, well commented];
Origin: QSOS (Quality of Source Code), but focusing on the amount of comments (relative parameter).
How to compute: browsing forges/meta-forges or using tools to inspect the source code.

Metric name: Computer Language used [more than 3 languages, one main language, one unique language];
Origin: QSOS (Technological dispersion), but with a different scale.
How to compute: browsing forges/meta-forges or using tools to inspect the source code.
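
Tallying source files by extension is a crude but quick proxy for technological dispersion. A sketch with illustrative cut-offs (mapping extensions to languages, and counting files rather than lines, are both simplifications):

```python
from collections import Counter
from pathlib import Path

# Count source files per extension across the tree.
def extension_counts(root: str) -> Counter:
    return Counter(p.suffix for p in Path(root).rglob("*")
                   if p.is_file() and p.suffix)

# Map the number of distinct extensions onto the scale above.
def dispersion_grade(exts: Counter) -> str:
    if len(exts) <= 1:
        return "one unique language"
    if len(exts) <= 3:
        return "one main language"
    return "more than 3 languages"
```

In practice the distribution matters too: a project that is 95% one language with a sprinkling of build scripts is still effectively "one main language", whatever the raw extension count says.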

Metric name: Code modularity [not modular, modular, available tools to create extensions].
Origin: QSOS (Modularity), but stressing the importance of tools availability.
How to compute: Browsing project’s websites, especially documentation/sections for developers.

Metric name: License [copyleft, corporate, permissive];
Origin: QSOS (License Permissiveness).
How to compute: Browsing project’s websites.

Metric name: Modifiability [no way to propose modification, tools to access and modify code available but the process is not well defined, tools and procedures to propose modifications are available];
Origin: QSOS (Modification of Source Code).
How to compute: Browsing project’s websites, especially issue/bug-tracking systems and forums.

Metric name: Roadmap [n/a, not detailed roadmap available, detailed roadmap available];
Origin: QSOS (Roadmap).
How to compute: Browsing project’s websites.

Metric name: Sponsor [unique sponsor, community sponsor, foundation/consortium sponsor].
Origin: QSOS (Sponsor), but using a different scale that values “no sponsor” (community) higher than a unique sponsor.
How to compute: Browsing project’s websites.