The Benefits of Linking Data

John Pugh, CTO of Nathean Technologies, discusses the business value of linking and cross-analysing data from multiple sources and applications.

Information is increasing at a rapid rate across the average organisation. A recently published report by The Bloor Group found that 71 percent of the company executives polled, the majority of whom were IT specialists, consider information silos “a significant issue.”

However, an overwhelming number of companies are operating with blind spots and struggling to get the right information. This has driven the need for advancements in business processes and information management. No longer are data analysis and reporting isolated activities – business leaders are recognising the need to integrate processes across functions into a single point.

Five Step Approach to Data Integration
The process of bridging data silos needs to be managed very carefully, or else data can become dispersed across departments, leading to an inconsistent and inaccurate picture of business performance. Companies are now looking to unified data to determine future plans and improve their operations, but first they must decide how to bring their data sources together in the most cost-effective, productive way.

The first step an organisation should take when approaching any BI project is to clearly outline its key goals and objectives. These should be communicated clearly to business users, and progress should be measured against them regularly. Common objectives include: improved speed of access to data, empowerment of end users, addressing data related to other business areas and achieving cost savings.

The second step is to nominate a “data ambassador” who will act as the main driver behind the project. Their role is to assist other users and address potential barriers along the way. Whether or not a broader BI strategy is in place, every company should look to appoint a data leader who is responsible for data management in the business.

Thirdly, the organisation should decide what external data needs to be brought in and analysed alongside internal data. This may include website statistics and social media metrics such as brand mentions and customer queries. Together, these provide a 360-degree view of how the business is performing.

The fourth step is to seek a data discovery solution that encourages user empowerment, offers real-time access to the information held in its systems and can adapt effectively to meet changing business needs. It should also complement existing systems and tools, as this tends to lead to a higher user adoption rate.

Finally, the company should use the solution to build a cross-functional bridge of communication and monitor results closely to make sure the original goals and objectives have been achieved.

Key Benefits of Bringing Data Together
Joining internal and external data together provides companies with a more complete picture of how their functions are performing and equips business users with the intelligence they need to make better decisions. It eliminates the guesswork and intuition that may have been relied on previously.

The long term benefits of implementing an integrated BI solution far outweigh the cost of investment. In a highly competitive and fast paced market, having access to the right information at the right time can be a key factor in a company’s ability to respond to any changes.

Another positive effect is that it helps to eliminate internal data silos, making communication between functions more consistent and aligned. It is generally acknowledged that companies that provide their employees with up-to-date, real-time metrics outperform those that don’t, because they are able to link actions with results more accurately and adjust their operations and processes accordingly.

Case Examples
According to a 2011 study conducted by IDC, the amount of data being stored by enterprises is doubling every two years. Today, universities and local councils alike are sitting on vast quantities of data in their systems and applications. The volume and complexity of the data that grows across Finance, HR, Procurement and Payroll can be a real challenge. Yet achieving transparency and visibility is no longer simply a lofty goal, but a necessity of the modern world.

The consequence of not delivering these capabilities is often a heavy reliance by executives and managers on spreadsheets and custom reports. Almost every resource planning system now has the ability to create reports and export them to spreadsheets for distribution. While this approach certainly has a place in an organisation, the best BI deployments relegate spreadsheets to a supporting role, with the real intelligence being delivered in a validated BI environment that combines the agility of spreadsheets with the security of a central solution.

Fingal County Council, for example, implemented Nathean’s Enterprise analytics software to link and analyse data across collections, purchasing practices, resource planning and attendance management. They can now access and analyse their data at a more granular level allowing more detailed management reports to be produced. As a result, they are able to make better informed decisions, negotiate competitive terms with their suppliers and achieve cost savings.

Conclusion
Given the rapid growth in data and the increasing number of disconnected systems, databases and applications, there is a clear need among organisations to bring data together and organise it for meaningful analysis. Technologies that deliver this have a huge amount to offer customers over the next decade and beyond. We are living in an era where Big Data and Web 3.0 are very much influencing the way we want to see information.

What are your thoughts on data integration? Please leave your comments below.

Big Data Scaling and Patterns

Christine Lynch, Technical Consultant at Nathean Technologies, looks at the growing importance of data scaling techniques and how big data patterns can help organisations solve frequently occurring problems…

“It is a very sad thing that nowadays there is so little useless information.” – Oscar Wilde

Moore’s Law, which is widely accepted in the computer science world and even taken for granted, says that the processing power and storage capacity of computer chips double – or their prices halve – roughly every 18 months. This has led to the amount of digital information increasing tenfold every five years.
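
As a quick back-of-the-envelope check on that arithmetic, a doubling every 18 months does indeed compound to roughly a tenfold increase over five years:

    # 60 months of growth at one doubling per 18 months
    doublings = 60 / 18           # ~3.33 doubling periods
    growth = 2 ** doublings       # total growth factor
    print(f"{growth:.1f}x")       # prints 10.1x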

This data explosion brings with it new challenges for organisations. While most acknowledge the existence and importance of big data, they are struggling to fully exploit its potential. New technologies have been developed to store and manage big data, but storage is only half the battle in gaining insights and value.

The Changing Perception of Big Data
Data has gone from being an unwanted overhead to a central player in business today. As basic programs grew into applications, and applications moved online, the sheer amount of data being generated and processed forced the old attitude to give way to the current thinking.

Central to this new attitude of respect for data are the advancements in hardware. Without hardware evolving to the point where storing and accessing data became not only more efficient but cheaper, data in the volumes we see today could not have existed. Data is no longer discarded without thought or consideration of its value. Quite the opposite: in today’s online world even a user’s ‘data exhaust’ – the trail of clicks that internet users leave behind – is recorded and exploited.

Scaling Data Stores
The factors discussed have led to the growing importance of data scaling techniques, both in enhancing current systems and in ensuring greenfield projects are future-proofed for scaling. Never before has it been so important for a company to have reliable, fast data stores. New technologies and techniques are emerging to deal with big data and its scaling, and the traditional data stores of relational database management systems are vastly different to the new NoSQL data stores currently emerging.

Scaling data stores is a complex and varied science, and there are numerous approaches one can take to increase performance and reliability. It used to be the case that one could only beef up a single server to increase performance, but with advancements in both server and communication technology, most large data stores today not only run on powerful servers but are spread across several machines in a distributed manner. These approaches to scaling can be categorised as follows:

– Vertical scaling: adding more power to a single machine.
– Horizontal scaling: adding more machines to a network of machines.

Vertical scaling techniques include adding more storage, memory or processing power. Horizontal scaling techniques include server clusters with high-spec communication lines, sharding, replication and caching. All of these techniques can be applied to both RDBMS and NoSQL data stores; one of them, sharding, is sketched below.
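
As a minimal sketch of one horizontal-scaling technique, the Python snippet below routes records to database nodes by hashing a key – the idea behind sharding in both RDBMS and NoSQL deployments. The shard names and keys are invented for illustration:

    import hashlib

    # Four illustrative database nodes (names are hypothetical)
    SHARDS = ["shard-0", "shard-1", "shard-2", "shard-3"]

    def shard_for(key: str) -> str:
        """Route a record to a shard by hashing its key. A stable hash
        (rather than Python's randomised hash()) keeps the mapping
        consistent across processes and restarts."""
        digest = hashlib.md5(key.encode("utf-8")).hexdigest()
        return SHARDS[int(digest, 16) % len(SHARDS)]

    # Customer records end up spread across the nodes
    for customer_id in ["C1001", "C1002", "C1003", "C1004"]:
        print(customer_id, "->", shard_for(customer_id))

One caveat worth noting: adding a machine changes the modulus and remaps most keys, which is why production systems often prefer consistent hashing.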

Big Data Patterns
Design patterns are reusable templates and frameworks that can be applied to solve frequently occurring problems. The evolution of big data has seen similar frequently occurring problems emerge. Derick Jose, director at Flutura Decision Sciences and Analytics, calls these big data patterns ‘Workload Patterns’ and believes that they greatly assist in defining solution constructs for businesses.

These distinct and specific big data workload patterns can help businesses identify clearly what their needs are. Once a business has a pattern that fits its use case, it is easier to wade through the many big data technologies on offer in order to find the best fit.

Applying workload patterns to big data solutions greatly reduces the potential complexity of a solution and clarifies the task at hand.

Concluding Thoughts
The wave of big data is here to stay and still rising. It has spawned a growing business sector of data analysts, data scientists and business intelligence professionals and tools, who crunch data in unprecedented ways to help organisations make sense of their proliferating data, often leading to massive cost savings and new business opportunities. According to an article by Kenneth Cukier, the data management industry is worth more than $100 billion and growing at almost 10% a year – roughly twice as fast as the software business as a whole. The monetary value associated with big data means that managing it has never been more important.

The big data phenomenon is commonly described in the technology world using the three V’s: Volume, Velocity and Variety. First introduced by Doug Laney in his article on data management, the model describes how big data management strategies can be defined along those three dimensions: Volume, the physical amount of data an organisation can manage and store; Velocity, the increasing rate at which data flows into an organisation; and Variety, the diverse nature of data coming from different sources.

With this in mind, the need to get value from big data and realise its potential is crucial to successful business development. Organisations need to put in place the business intelligence and data science personnel to deal with the variety of their data, the technical structures to deal with its volume, and the ability to adapt easily to its changing velocity. With these in place, an organisation can create meaningful patterns to plug into a pattern-based strategy and fully exploit the significant value in its data.

What are your thoughts on Big Data? Please leave your comments below.

A Look Ahead – Business Intelligence Trends and Developments

Robert Doherty, Product Director of Nathean Technologies, discusses BI trends and developments for the year ahead.

The BI market is evolving and changing rapidly, particularly in recent years. Here are my predictions for the top BI trends and developments for the year ahead.

1. In-Memory Solutions
If you follow what’s happening in the BI industry, you will have heard about “in-memory” BI solutions.  Although in-memory has been around for over three decades, its popularity has exploded in recent years.

In-memory refers to BI software that uses an in-memory database for processing data, providing an alternative to data warehouses. The most obvious reason for opting for in-memory is the speed of analysis it offers. This is a significant benefit to business users, as the biggest bottleneck in typical BI applications is often slow database access. In addition to faster data access, in-memory analytics can reduce the need for indexing, which in turn reduces IT costs and allows for faster implementation of BI software.
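
To make the idea concrete – this illustrates the general technique, not any particular vendor’s engine – Python’s standard library can stand up an in-memory SQL database in a few lines, with queries answered from RAM rather than disk:

    import sqlite3

    # An in-memory database: the data lives in RAM, not on disk
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [("North", 1200.0), ("South", 950.0), ("North", 430.0)],
    )

    # Aggregations run without any disk access or index maintenance
    for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"
    ):
        print(region, total)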

According to Gartner, the world’s leading technology research company, in-memory capabilities will continue to be an enabling technology, expanding BI to a broader range of users as more and more BI vendors incorporate them into their portfolios to deliver Google-like response times when exploring vast amounts of increasingly diverse data via intuitive yet sophisticated mobile BI tools and applications.

2. Self-Service Capabilities
Self-service BI continues to be sought after by companies whose users want immediate access to, and control over, data without having to rely heavily on the IT department. Forrester Research supports this trend towards self-service, stating that self-service features of BI tools, such as semantic layers and search capabilities, will become increasingly critical when selecting and deploying BI tools and solutions.

One important factor organisations have to consider when deploying a self-service BI tool is the level of control users should get over business data. Giving users no control at all is unrealistic, but giving them too much can have a negative impact on an organisation in the long term. It’s about finding the right balance to suit the needs of the end users and IT staff. The right solution will offer the agility that self-service delivers while governing the level of access users get.
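
In practice, that balance often takes the form of role-based filtering: users explore freely, but only over the fields IT has granted them. A minimal, hypothetical sketch (the role names and fields are invented for illustration):

    # Hypothetical column-level permissions for self-service access
    PERMITTED_COLUMNS = {
        "sales_rep": {"customer", "region", "amount"},
        "finance": {"customer", "region", "amount", "margin", "cost"},
    }

    def visible_view(role: str, record: dict) -> dict:
        """Return only the fields this role is allowed to see."""
        allowed = PERMITTED_COLUMNS.get(role, set())
        return {k: v for k, v in record.items() if k in allowed}

    record = {"customer": "Acme", "region": "West",
              "amount": 5000, "margin": 0.32, "cost": 3400}
    print(visible_view("sales_rep", record))  # margin and cost withheld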

3. Multiple BI tools
Over the past decade, there has been a significant rise in the number of organisations deploying multiple BI tools. Rather than going against this trend, organisations should be embracing it. No single vendor can offer all of the necessary BI capabilities, so why not source them from a suite of solutions? This ensures companies get the maximum value from their data while remaining as flexible as possible.

Keep in mind, however, that having too many BI tools can have the opposite effect. Streamline the tools you have down to a manageable number and ensure the right management approach is in place to gain the most from each of them.

4. BI becomes more widespread 
There was a time when business intelligence was viewed as a luxury purchase. Nowadays, business intelligence has become a necessity for the vast majority of organisations due to the phenomenal growth of data.  Every company and every role has information requirements that cannot be met by spreadsheets alone.  According to a recent study by Gartner, 47% of survey respondents ranked data growth in their top three challenges.

BI has also become a lot more affordable. New cost-effective BI packages give SMEs the kind of data analysis capabilities only previously found in large multinationals. This makes BI far more accessible and scalable to small and large companies alike.

5. Agility is Key
According to Forrester Research, 70% of BI decision makers are facing business requirements that change monthly or even more frequently. Therefore, it’s no surprise that one of the biggest trends this year is a movement towards more agile BI solutions.

Traditional BI tools have never been fully responsive or flexible, which has led to a rising demand for agility. Agile BI is characterised by a low Total Cost of Change (TCC). It is not only able to handle change better; it actively encourages business users and IT professionals to think about their data differently, to easily ask new questions and to visualise the answers in a variety of interesting & informative ways.

6. Social Integration
Social BI can deliver significant advantages as it provides point-in-time snapshots and opinions directly from the target audience. By integrating social media data with other data across the organisation, decision makers can get live, up-to-date feedback on anything they want. Data analytics can now be used to measure the performance of social media campaigns and gain a better understanding of brand awareness in the marketplace.

We are seeing organisations taking a more pragmatic view of reporting and analysis and focusing on their immediate key business drivers. Consequently, businesses are investing in technology which starts to deliver results cost-effectively in days, not months or years.

The number one factor we see is time-to-value. Before investing in new technology, businesses want to know how quickly they will see benefits.

Visit www.nathean.com for more information about Agile Business Intelligence and how it can help your organisation.

What are your thoughts on these trends? Please leave your comments below.

Bringing True Agility to BI

John Pugh, CTO of Nathean Technologies, discusses the true value of Agile Business Intelligence and how it’s time for organisations to step away from using complex, traditional BI tools.

The BI market has evolved rapidly since the term “business intelligence” was coined back in the ’80s. There is no doubt that the market has matured considerably, with a huge array of products and services to help organisations with their data management projects – predictive analytics, data warehousing and scorecards to name but a few.

While all these offerings provide some value to their users, they are typically targeted at the IT function. Given that the majority of users in any given organisation are non-technical, it is no surprise that one of the main causes of failure in BI projects is a low user adoption rate. Traditional BI tools are typically too complex for the average business user, and very often there are significant costs involved in implementing them across the organisation.

Agile BI delivers information at the speed of thought

Business users are under pressure to get answers quicker than ever before, and a slow, traditional approach to analysing data can make decision-making extremely difficult. Who has the time to wait around for static reports when you can have the data you need right now, without an elaborate development cycle?

Agile Business Intelligence 101
So what is agile business intelligence, and why should organisations implement it? According to the leading IT research company Gartner, agile provides a streamlined framework for building business intelligence that delivers faster results using just a quarter of the developer hours of a traditional waterfall approach. Agile can cut project costs in half and drive project defect rates toward zero.

The benefits of moving towards an agile BI framework are significant.  Using the power of in-memory technology, users can get the specific answer they require, not just multiple possible answers.  This approach ensures they can respond rapidly to any changes in the organisation instead of spending time on further analysis and questioning.

Another key benefit of agile BI is the flexibility it offers: it is designed to change along with the organisation and its BI requirements. In fact, a 2011 survey of 200 business and IT executives conducted by Forrester found that 67% of respondents said their BI requirements change at least monthly. It is widely acknowledged that BI changes more frequently than most other types of software project, which is why flexibility is so important when it comes to selecting the right offering.

Ultimately, agile BI projects are all about the people, rather than the process itself. It puts the business user first and foremost, above all else.

Agile BI Case Example
A great case example is Fingal County Council, who implemented agile technology to link and analyse their data across collections, purchasing practices, resource planning and attendance management. They decided this was the right approach to their data-sharing challenge as it avoided the need for a data warehouse, which can often create backlogs and slow the entire process down considerably. They wanted to create a connected network of information, as opposed to isolated data islands.

In a matter of days they could share insights and create a repeatable process between data sources. They could see instant results with very little effort. The business benefits were clearly evident – they were able to make better informed decisions, negotiate competitive terms with their suppliers and achieve real cost savings.

Big Data and the Agile Enterprise approach
“Big data” is a buzz term we are seeing used more often in the IT industry, particularly in the social media world.  With 340 million tweets being posted every day, it’s not surprising that marketers are struggling to get a handle on the huge volume of data stored online.

According to Gartner, big data is the term adopted by the market to describe extreme information and processing issues which exceed the capability of traditional information technology along one or multiple dimensions to support the use of the information assets.

You might imagine that making sense of big data means spending big money. However, I believe that by implementing agile BI technology, companies of any size can achieve real benefits for a moderate investment. The reasoning is that agile BI puts the ability to gain insights into the hands of the users while keeping operational costs as low as possible. It removes the need for spreadsheets and custom reports, making analysis of huge volumes of data an easy task.

The Future of Agile BI
Organisations across the world are recognising the need to move away from earlier-generation applications and embrace next-generation agile technology. Business users increasingly need to be more connected to information, and this means having instant access to the data they require, at the right time and in the right way.

Given the turbulent economic climate, businesses cannot afford to operate with a lack of agility across their practices and functions. There therefore needs to be a move away from traditional BI approaches such as silos and centralisation, which were hugely popular in the ’90s.

We are already seeing the power of agility being harnessed in areas such as computer-based learning and energy conservation. Delivering a successful agile BI project means getting the balance right: having the appropriate standards in place to support change while remaining consistent.

What are your thoughts on Agile BI? Please leave your comments below.

The Power of Simplicity

Maurice Lynch, CEO of Nathean Technologies, discusses how it takes a bit of effort to make something simpler – but the results are surprisingly powerful.

You never hear someone say “that’s so complex, why didn’t I think of that?”.

People respond well to simplicity. We reward those who make simple gadgets and berate the things that make simple tasks difficult. Too often we are overloaded with options, which sounds great, but having too many options can slow down the decision-making process. What’s my best option? The goal of taking the time to simplify is to end up with something superior and more useful than the original. Yet simplifying anything is oddly difficult. How do you decide what to leave in and what to take out? How much time should you spend with the scalpel?

“The present letter is a very long one, simply because I had no leisure to make it shorter.” – Blaise Pascal, 17th-century French mathematician, physicist, inventor, writer and philosopher.

The argument for simplicity is compelling. Great leaps in philosophy, engineering, mathematics, physics and so on have come about because someone took some ‘leisure time’ to think about a particular problem and reduce it to its essence. When people wanted to believe that the Earth was the centre of our solar system, astronomers had many convoluted mathematical formulas, diagrams and assumptions to prove it – or so they thought. The truth was much simpler to explain. Occam’s Razor, known as the law of economy, states that among competing hypotheses the one with the fewest assumptions should be selected – or, by example, if you hear galloping hooves it’s safe to assume horses and not zebras!

In the 1940s Alan Turing reduced the concept of ‘information’ into its simplest form – 1’s and 0’s – which one day could be read and processed by a machine. The digital age would not have happened, or certainly would not have happened when it did, without such thinking. More than likely you’re reading this on some device, and you are able to see it because everything has been reduced into a machine-readable sequence of 1’s and 0’s. It doesn’t get much simpler than binary.
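
As a one-line illustration of that reduction, here is the letter ‘A’ as the eight binary digits a machine actually stores:

    print(format(ord("A"), "08b"))  # -> 01000001: 'A' as 1s and 0s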

Software, however, has a unique quality: there is no system feedback to say there are too many features or options. Give a computer more memory and disk space and you can keep loading on the features. In engineering, the laws of physics provide the checks and balances – overload the bridge and it collapses. So there is a level of discipline required in software design to keep it simple yet powerful.

In our company we strive for simplicity. Our tagline from day one has been “Data Analysis Made Easy”… but do we always get it right? Absolutely not! Our definition of simplicity may not always be how the customer sees it, and we have to think continuously about how to make our products simpler to use and at the same time more powerful. We look at traditional methods of building business intelligence solutions and think about new approaches that deliver simpler, more agile and more customer-interactive solutions. And here’s the point: simplification needs to be a first-class activity – so I’m going to follow through and simply stop writing now!

The Role of Analytics in CRM

Robert Doherty, Product Director of Nathean Technologies, discusses the role of analytics in CRM.

For too long, many people viewed sales as an art based on gut feel and instinct, rather than a science. In the latter half of the 20th century, primarily led by sales experts in the US, the “mystery” was removed from sales, and today most people believe that the key to successful sales performance is a well-understood process that brings the sales person and their customer to a successful deal through a series of well-known stages. Most modern CRM solutions support this process-driven sales effort, but unfortunately for sales managers they do not provide the agile business tools required to accurately manage & monitor the sales effort.

When I consider the role of analytics in CRM today there are two quotes that instantly spring to mind. The first is from Peter Drucker, who is often referred to as the man who invented management. He said, “You can’t improve what you don’t measure”. The second appears in many variants but the one I like is, “Sales people deliver on what you inspect not what you expect”.

The failure of CRM solutions to let us measure our most important sales metrics – so that we can improve sales efficiency and forecasting accuracy – is the reason that Best-in-Class organisations are turning to third-party analytics solutions in ever increasing numbers. A recent report from the Aberdeen Group entitled “Better Sales Forecasting through Process and Technology” found that 81% of the best-performing sales organisations utilise performance dashboards, compared to 54% of other organisations. These organisations are also planning to increase their investment in sales analytics by, on average, 13.1% year-on-year so that they maintain their perch at the top of the sales tree.

Key CRM Analytic Challenges
In my opinion there are four key challenges that must be overcome in order to produce meaningful measures of sales activity from CRM systems.

The first challenge, and the most common, is the quality of the data that exists in the CRM system itself. Sales people, by their very nature, tend not to be inclined to spend time recording meaningful information about their leads and opportunities. They want to get to a deal, and can often see CRM as a bureaucratic bump on the road to earning commission. Successful sales organisations recognise this personality trait and take action to overcome it. Regular CRM training is an essential part of this. Many leading companies seek to automate as much of the data input as possible by linking to other systems, for example voice calls or email, or by introducing automatic triggers that move leads and opportunities through stages so that the sales rep’s data input is minimised. In my experience analytics can help here too, by providing a set of exception reports and metrics that monitor data quality. This allows sales managers to quickly identify problem areas and take remedial action when required.
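
An exception report of this kind can be very simple. The hypothetical sketch below (the field names and the 30-day threshold are invented for illustration) flags opportunities with a missing close date or no recent activity:

    from datetime import date, timedelta

    STALE_AFTER = timedelta(days=30)  # illustrative staleness threshold

    def quality_exceptions(opportunities, today):
        """Flag records a sales manager should chase up."""
        issues = []
        for opp in opportunities:
            if not opp.get("close_date"):
                issues.append(f"{opp['id']}: missing close date")
            if today - opp["last_activity"] > STALE_AFTER:
                issues.append(f"{opp['id']}: no activity for 30+ days")
        return issues

    opps = [
        {"id": "OPP-1", "close_date": date(2013, 3, 1),
         "last_activity": date(2012, 11, 2)},
        {"id": "OPP-2", "close_date": None,
         "last_activity": date(2013, 1, 20)},
    ]
    print(quality_exceptions(opps, today=date(2013, 2, 1)))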

The second issue that needs to be dealt with is the removal of the “human factor” from sales forecasting. Even when an organisation has a defined sales process with sales stages and associated deal-closing probabilities, the sales rep can still overstate their monthly, quarterly or annual forecast through over-confidence or an unwillingness to face the hard truth. The best-performing businesses use analytics to identify this problem and to minimise its impact on forecasting accuracy. Past opportunities are used to predict how current leads will pan out. Facts such as the age of an opportunity, its last activity or its next planned activity can be used to more accurately predict the chance of closing in the forecasted timeframe.
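
One simple way to take the optimism out of the number is to weight each deal by the historical win rate for its stage rather than by the rep’s stated probability. A hypothetical sketch, assuming the win rates have already been derived from past opportunities:

    # Historical win rates by sales stage (illustrative figures)
    HISTORICAL_WIN_RATE = {"qualified": 0.20, "proposal": 0.45,
                           "negotiation": 0.70}

    def weighted_forecast(opportunities):
        """Weight each deal by its stage's historical win rate,
        not the rep's own stated probability."""
        return sum(opp["value"] * HISTORICAL_WIN_RATE[opp["stage"]]
                   for opp in opportunities)

    pipeline = [
        {"value": 50_000, "stage": "proposal"},     # rep claimed 80%
        {"value": 20_000, "stage": "negotiation"},  # rep claimed 95%
    ]
    print(f"{weighted_forecast(pipeline):,.0f}")    # prints 36,500

A fuller model would also weight by the facts mentioned above – age of opportunity, last activity and next planned activity.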

Thirdly, I often find that organisations do not measure their “deal velocity” – the ideal progress of a lead through the various sales process stages – and so cannot identify deals that are faltering and therefore unlikely to end in a successful close. Again, analytics can solve this problem by presenting clear metrics that surface exceptions to the sales manager, who can then quickly investigate. Sales reps can be re-trained or sales processes adjusted to produce a more efficient and better-performing sales team.
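
Deal velocity can be monitored by comparing each opportunity’s time in its current stage against a historical norm for that stage. A minimal, hypothetical sketch (the baselines and field names are invented):

    # Typical days a healthy deal spends in each stage (illustrative)
    BASELINE_DAYS = {"qualified": 14, "proposal": 21, "negotiation": 10}

    def stalled_deals(opportunities):
        """Flag deals sitting in a stage far longer than the baseline."""
        return [opp["id"] for opp in opportunities
                if opp["days_in_stage"] > 2 * BASELINE_DAYS[opp["stage"]]]

    pipeline = [
        {"id": "OPP-7", "stage": "proposal", "days_in_stage": 55},
        {"id": "OPP-8", "stage": "negotiation", "days_in_stage": 6},
    ]
    print(stalled_deals(pipeline))  # ['OPP-7'] - faltering, investigate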

Lastly, I firmly believe that CRM analytics should not be restricted to the sales team or to senior management. Best-in-Class organisations realise that sales forecasts affect not just the organisation’s total revenue but have a knock-on effect in areas such as staff hires, supply chain, logistics and purchasing. These organisations ensure that stakeholders in all these areas have access to the right tools to enable them to plan for the future with confidence.

Selecting the Right Analytics Platform
In my opinion, the key objective when selecting an analytics tool for CRM (and indeed any enterprise business intelligence tool) is that all stakeholders must be able to get answers to their key questions in an acceptable amount of time. In today’s world that often means information must be available in real time. Indeed, the Aberdeen Group reports that among top-performing companies, 50% provide real-time sales forecasting analysis, and 100% are able to pull a sales forecast in less than 2 hours. Contrast that with the worst-performing organisations, where 31% take more than 4 hours to generate a sales forecast.

If an organisation wishes to match the performance of their top ranked peers they must select an analytics platform that:

  • Supports real-time analysis of their CRM system whether it is on-premise, hosted or in the cloud.
  • Allows power users to build new rich datasets that can be used in a variety of agile ways by the various tools.
  • Provides easy-to-use tools for users of all capabilities including:
    – Dashboards so that key metrics are presented visually and are easily understood. Exceptions are identified & highlighted and anomalies can be quickly investigated using drill-down functionality.
    – Analytics so that metrics can be easily created, represented visually, quickly added to dashboards and shared with colleagues.
    – An end-user query tool so that users can ask their own ad-hoc questions and can share insights.
  • Has a low Total Cost of Ownership (TCO) so that on-going training, maintenance & development costs are minimised and a low Total Cost of Change so that as the business evolves the platform can react quickly & economically.
  • Connects to other business systems and combines their data with CRM to increase the breadth of knowledge available. Some real-world examples are: introducing marketing data so that campaign-to-sales statistics can be generated; finance data so that actual sales can be compared to sales forecasts; and Payroll & HR data so that the cost of sales can be more accurately calculated.

Business Benefits
The leading 20% of companies that implement CRM Analytics as described above report:

  • A 90% customer retention rate versus 76% for the middle performing 50% and 41% for the worst performing 30%.
  • A 13% year-on-year increase in sales quota achievement versus a 1.3% increase (middle 50%) and a 5.2% decline (worst 30%).
  • A 6.5% year-on-year reduction in average sales cycle versus a 1.1% reduction (middle 50%) and a 4.7% increase (worst 30%).

For these companies total revenue increases 8.6% year-on-year, margin increases by 4.7% and lead conversion improves by 1.2%. In addition to these measurable figures and statistics, general confidence improves in the company and all departments can better plan for the future.

CRM Analysis Trends
The world of CRM Analysis is evolving all the time as more and more organisations see the value that can be unlocked in their sales data and the benefits that can then be accrued. Some of the trends that can be seen in the market include:

  • The addition of social media data to the CRM mix. More and more organisations are using social media to spot opportunities and gauge public reaction to their products, their competitors and their industry. While the value of social media data is still a little unclear to many (see Facebook share price) we expect to see it play an ever more important role in the coming years.
  • The move to mobile will continue. Everyone has seen the statistics on smartphone and tablet adoption rates. Analytics are already available on these devices, and we expect this trend towards analytics-on-the-go to gain more and more traction. With this in mind, the impact of Microsoft’s new Surface device will be of interest in the short term.
  • CRM Analytics in the Cloud. Many of the market-leading CRM products are of course already cloud-based, including Salesforce and SugarCRM. There are already cloud-based analytics options for these solutions, but currently they do not lend themselves to easy integration with other data sources that may or may not also be cloud-based, including Finance, HR, Payroll & Marketing data. We see this integration as an essential part of CRM analysis and expect to see some new and very clever cloud-based solutions soon.

What are your thoughts on CRM analytics? Please leave your comments below.