Posts Tagged ‘Qlikview’

The Pre-Sales Diary: Great but Too Expensive, Dear!

Thwack!

That feeling of being dumped by a potential customer can materialize through any of the many options available to such customers. Although there is (apparently) no harm in being blunt with the sales team, for the sake of a better euphemism these no-longer-‘potential’ customers can simply blame their distaste for your products/services, or whatever you do, on it being overpriced!

Although a pricey tag is a fashion statement and an admired trait in some novelty industries, most enterprise business software treats it as taboo.

First of all, we all know it: IT notoriously sucks money out of a business, and when there is no enterprise strategy around it, IT is definitely a pure cost center. That is why they invited you to sell them business performance management and intelligence software: to get a share of the corporate ‘strategy’ cake.

But you don’t want to hear any of this. You spent a lot of time and people executing a sales cycle, raised expectations, probably delivered a proof of concept with purpose, all to listen to the once-potential customer spit out the devilish decree. It all starts with ‘But…’ and is followed by a variation of ‘your solution is too expensive’, ‘we can’t allocate the budget for it’, ‘we don’t make the final purchasing decisions’, etc.

In reality, this is just a polite way of saying that you didn’t meet expectations or weren’t able to establish the right value for your products/services.

How do you cater to this catch-22 situation?

Many survivors report some common strategies, including:

  1. Price Justifications (e.g. our product works in zero gravity, our costs are only upfront-heavy, incremental upgrades are very cheap)
  2. Price Distractions (e.g. we have a very low overall TCO)
  3. Competitor Demeaning (e.g. the competitors have lousy products and are thus cheap) (pun intended)
  4. Bargaining (e.g. what’s your budget, let us fit something for you, else we will definitely, ultimately come down to your level)
  5. Reinventing the Sales Wheel (e.g. let’s try again, let’s talk again, let us repeat our efforts to emphasize why we are not so affordable)
  6. Reassessing our own Assumptions about the Expectations and Value Offered (e.g. does the customer really know what they can get as true ROI, is our product redundant, can they solve the pain point using other less expensive solutions)

The reality is that most of these techniques are used pretty frequently, and some of them are quite demeaning (e.g. 3). In most cases, the bottom line is that you need to set the Expectations straight, and such an objection only indicates that you weren’t effective at doing so.

Once the objection is raised, ask the prospect what more the product/service would have to offer for him to rethink the budget.

He will either give you the points for mending the gaps or acknowledge that your product’s fit is good.

In the former case, if the points mentioned can be covered by your products/services with a workaround or a doable approach, go ahead: you have nearly resolved the objection.

If the prospect is unable to provide any missing points, then you need to re-emphasize the need, figure out the real decision makers (if he/she cites others for budget approval), or figure out the true ‘champions’ and ‘villains’ in your deal. Most likely, you will find that your current assessment differs from your initial assessment.

Apply only those changes; this will set new Expectations and hopefully you will have the objection resolved, with your products/services valued the way you wanted, or pretty close to it.

Happy Selling!


The Pre-Sales Diary: Data Profiling before Proof of Concepts

The Raison D’Être for many Pre-Sales Engineers is to carry out Proof of Concepts. For most potential leads, Proof of Concepts are to be avoided because they incur greater costs in the sales cycle, increase the time to close and increase the chances of failure, but there are certain cases where a proof of concept helps the sales cycle much more than anything else.

Some of these cases arise when there are competitors involved touting the same lingo/features/capabilities; others involve a genuine customer scenario which needs addressing in a proof of concept, either because the scenario is pretty unique, it is part of their due diligence, or your product hasn’t been tested in those waters before.

Pre-Sales folks are pretty comfortable with their technology, which they like to showcase to such customers, but they are totally new to the customer’s scenario. There is always a chance of failure, and failures abound.

Before embarking on a scope for a proof of concept and promising deliverables, it is more than advisable, in fact mandatory, to analyze not just the customer organization, but also its processes, metrics and of course data.

The last part is what I find most proof of concepts depend on. Everything is set: you held extensive interviews with the stakeholders and know what needs to be ‘proved’, you scoped out a business process or two, figured out some metrics and one or two KPIs, and they gave you access to the data pertaining to it. Now the ball is in your court, but before you know it, you’re doomed!

The data is incomplete, inaccurate, and has tons of issues which data governance and MDM were meant to solve but didn’t, because they don’t exist yet. In all likelihood, the customer is quite unaware of such issues; that is why you are offering them a Business Intelligence solution in the first place, to tap into their data assets. They have never done so before themselves, or have done so in too limited a way to uncover such obstacles. In the other scenario, where they are aware of these issues, either they are unable to tackle them, or it is a trick question for you: they want to check whether you cover this aspect or not.

You can brave the ‘time’ challenge by jumping right into the proof of concept and skipping the practices which are pretty standard during project implementations, ignoring all of them (or most of them) simply because ‘it’s just a demo’!

Kaput!!!!

I always carry out a small data survey activity before promising any value to be shown in the proof of concept, to make sure of what we have in store before we do anything. Simple rule: GIGO – Garbage In, Garbage Out. If you want a good-quality, successful demo, profile your data first, understand its strengths and weaknesses and, above all, let the customer know fully about the limitations; if possible, get enrichments to your data based on your profile to make your demo successful.

This one single step, performed or skipped, can lead to drastically different outcomes.

Data Profiling:

Data Profiling is defined as the set of activities performed on datasets to identify the structure, content behavior and quality of data. The structure guides you on what links to what, what is missing, whether you have all the required master data, whether the data has good domain representation (possible lists of values), and what granularity you can work with. Content behavior guides you on the customer’s NORMS in terms of KPI and metric values; e.g. if the dataset only contains age groups of 40+, then there is no need to showcase cross-selling market baskets targeted at toddlers: you can simply skip it, or ask for data enrichment. If you don’t have data covering more than one year, then you can’t have ‘year’ as a grain level, which for certain metrics and analyses might be critical. A Data Quality assessment, albeit a general one, can save you many hours ahead. The most notable quality issues are data formats, mixed units of measurement and spelling variants, e.g. RIAD, RIYADH, RIYAD and RYAD all indicating the same city, or mixed bilingual datasets for names, addresses etc.

There are many tools available out there which can aid in Data Profiling, including the ubiquitous SQL and Excel. However, Data Profiling, being a means to an end and not the end in itself, does not warrant more time and energy than required; therefore a purpose-built, RAD-enabled data profiler is one of the most critical investments in your toolbox.

One which I have come across recently and which fits the bill very nicely is Talend OpenProfiler, a GPL-ed, open source and FREE piece of software engineered with great capabilities and power. You can carry out structure analysis, content analysis, column or field analysis and pattern-based analysis on most source systems, including many DBMSs, flat files, Excel files etc., with readily available results in both numerical and visual representations to give you a better sense of your data.

I believe all Data Quality tools are (or should be) equipped with good data profiling capabilities, most ETL vendors offer data profiling capabilities as well, and some data analysis packages like QlikView can also be used, albeit in limited ways, to profile data in limited time.
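As a rough illustration of that last, lightweight option, here is a minimal QlikView load script sketch; the table name, file path and fields (City, OrderDate, Amount) are hypothetical, and a dedicated profiler goes much further:

    // Load the candidate PoC dataset (path, sheet and field names are hypothetical)
    SalesData:
    LOAD City,
         OrderDate,
         Amount
    FROM [..\Data\PoC_Sales.xlsx]
    (ooxml, embedded labels, table is Sheet1);

    // Frequency profile of City: spelling variants such as RIAD / RIYADH / RIYAD
    // show up as separate rows with suspiciously similar names and split counts
    CityProfile:
    LOAD City,
         Count(City) as CityOccurrences
    RESIDENT SalesData
    GROUP BY City;

    // Grain check on dates: if MinYear equals MaxYear, 'year' cannot serve as a grain level
    DateRange:
    LOAD Min(Year(OrderDate)) as MinYear,
         Max(Year(OrderDate)) as MaxYear
    RESIDENT SalesData;

Dropping the resulting tables into a simple table box (or just inspecting the script execution preview) exposes split counts, odd values and single-year date ranges before any KPI work starts.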

The Data Profile can also be later shared with the customer as a value deliverable.


Happy Demoing!

Data Discovery – The BI Mojo

Gartner’s Q1-2011 Magic Quadrant for Business Intelligence was recently released.

Without much surprise, the four quadrants hosted some of the best BI offerings. As expected, QlikTech moved into the Leaders’ Quadrant thanks to its growing customer base, bigger deployments and a successful IPO last year.

Other players also shone, including the likes of Spotfire (TIBCO) and Tableau, which earned the Challengers title. This is a trend we see in the Magic Quadrant: no vendor moves directly into the Leaders’ box without entering the Challengers zone first. It is well expected that sooner or later Spotfire and Tableau will join the ranks of the leaders, while it is also quite possible that one or two existing leaders will start fading into history.

The Zeitgeist:

Data Discovery tools have the greatest mind share, success and momentum. They have proved to be highly disruptive and have pushed the slowly moving elephants aside. Although elephants might be able to dance, tools like QlikView, Tableau and Spotfire represent the new wave of BI from both adoption and approach perspectives.

These vendors are business-friendly, analyst-savvy, agnostic to (traditional) reporting and have very agile development approaches. That is why the buying criteria are reported to be:

1. Ease of Use

2. Rapid Deployment

3. Functionality

These in-memory offerings compete on OLAP’s limitations and thus add value on the functionality front, which is much appreciated by IT as well.

However, the arrival of these new-wave BI tools in the Leaders and Challengers quadrants has caused a chain reaction, with SAP, Microsoft and IBM Cognos innovating with their own in-memory offerings and interactive visual discovery tools. The post-2007 acquisition hangover lingers on, though, and the customer dissatisfaction caused by these acquisitions and mergers into the larger product and services suites of the mega-vendors remains a cause of concern for these players.

For these new-wave BI tools, age-old problems are surfacing, including Data Governance, Data Quality, Master Data Management, the Single Version of the Truth and the curse of information silos. Some of these new-age vendors are solving this by having a larger portfolio of products to cater to it, like TIBCO; others focus more on OEM partners to deliver these important facets, like QlikView; while still others rely on a symbiotic relationship with existing (traditional) BI deployments, like Tableau.

The Observations:

  1. Both traditional BI and Data Discovery tools are required; therefore, saturation in the Leaders Quadrant is far from reality, and the emergence of new vendors will still be observed.
  2. Overall BI maturity is rising, with the trend shifting from measurement to analysis to forecasting and optimization.
  3. Cost is an increasingly important factor in purchasing, and thus alternatives like open source offerings and SaaS deployments are gaining ground.
  4. Niche players will continue to flourish but need to have a viable road map amid the constant threat of mega-vendors replicating or acquiring (similar) technology.

Google Trends for Business Intelligence Today

An interesting comparison between traditional giants, open source competitors and innovative new-generation BI:

[Slideshow: Google Trends search-interest comparison of BI vendors]

It clearly shows that QlikView is gaining steady momentum, Pentaho is also gaining popularity, and there is a steady decline for the traditional powerhouses…

Qlikview Section Access – Some Thoughts

Security, in BI… is that a misnomer? I usually prefer the term Privacy instead…

Nevertheless, whatever term you use for the two processes, you still have to cater to authentication and authorization to see specific data.

QlikView, now going enterprise, has quite a mature security framework to comply with various standards including SOX, HIPAA and ISO, either directly or through partners like NOAD.

This means that any company which needs to certify against these standards can rest assured that QlikView follows compliance-friendly and open standards for both authentication and authorization of data.

However, those coming from other data security (privacy) frameworks, like those of traditional BI or ERP environments, will find some familiar patterns but also some differences.

The security patterns and how-tos are very well documented by QlikView; one of the good documents can be found here.

Here, I’d like to highlight the unusual way QlikView implements one of its security models, Section Access. Of course, there are other ways to implement security which resemble traditional approaches, using the QlikView Publisher, but here I’d like to focus on a quite powerful security mechanism built into the app itself, called Section Access, which serves a number of use cases for security implementations.

Section Access is a part of the load script which basically maps a list of groups/users to authorized fields/values and explicitly denied fields/values. As a consequence, using Section Access also results in Data Reduction, i.e. the splitting up (and reduction) of data per defined user based on their granted and denied authorizations.
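As a minimal sketch of what that mapping can look like (the user names, the REGION field and its values are hypothetical, and real deployments would not keep the matrix inline, as discussed below), a Section Access script might be:

    Section Access;
    // Authorization matrix: ACCESS and NTNAME are reserved Section Access fields.
    // Values must be in UPPER CASE to match, and '*' means "all values listed in
    // this table", not all values in the data - a classic gotcha.
    LOAD * INLINE [
        ACCESS, NTNAME, REGION
        ADMIN, MYDOMAIN\ADMIN, *
        USER, MYDOMAIN\JSMITH, EAST
        USER, MYDOMAIN\AKHAN, WEST
    ];

    Section Application;
    // Application data: the REGION field name matches the one above, so with
    // 'Initial Data Reduction Based on Section Access' enabled in the document
    // properties, each user sees only the rows for their region.
    Sales:
    LOAD * INLINE [
        REGION, Amount
        EAST, 1000
        WEST, 2500
    ];

A user whose NTNAME does not appear in the matrix is denied the document altogether, which is exactly how a careless developer can lock him/herself out, as discussed below.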

Some Problems:

1 – I hear people concerned about certain drawbacks or unexpected behavior with Section Access. First of all, there is a risk that developers can lock themselves out of the application if they are not careful. Well, yes, there should be a failsafe mechanism to warn the user beforehand, but then again, the idea behind Section Access is a self-securing, self-controlling QlikView app, independent of a centralized environment responsible for data security (privacy), in a truly disconnected, democratic analysis approach (by the way, still retaining a single version of the truth). The solution: use the document versioning feature built into QlikView 10 and roll back to a previous version of the app if this mishap takes place. Or, simply, take a backup of the app before implementing security (privacy) on it.

2 – Some people have pointed out that it is quite insecure to define the security matrix (actually the data authorization matrix) within the script. Although samples and demos pull in the security matrix using INLINE data loads, in reality the idea is to have the matrix locked in a data store which only the QlikView script is authorized to read, and to place that script within a ‘Hidden’ script tab to keep developers from getting overtly curious, or just accidentally curious. The location of the actual data store can also be concealed by loading it in the hidden script area. Of course, I am not asserting that it is totally failsafe either…
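A hedged sketch of that approach (the share path, file name and fields are hypothetical), placed in the hidden script tab, might look like:

    Section Access;
    // The authorization matrix lives in a protected store that only the reload
    // account can reach; nothing sensitive is hard-coded in the visible script.
    LOAD ACCESS,
         NTNAME,
         REGION
    FROM [\\secured-share\auth\section_access.qvd] (qvd);
    Section Application;

Combined with the hidden script, this keeps both the matrix contents and its location out of casual view, though, as said, it is not totally failsafe.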

For me, the idea behind Section Access is to add security (privacy) to disconnected analysis, which is usually the preserve of a few within an organization, as compared to the bulk of users who prefer the more vanilla, ready-made implementations and work with QlikView applications inside the corporate infrastructure. The latter can be handled by the more systematic and productive QlikView Publisher.

Bottom Line:

You have two types of security (privacy) mechanisms: the QlikView Publisher, which is more enterprise-friendly, centralized and IT-driven, and Section Access, a unique feature which maintains the security context even in disconnected analysis mode and provides a self-defining, self-controlling DAR (or MAD) application.

Related Links:

http://qliktips.blogspot.com/2009/08/section-access-gotchas.html

http://www.quickqlearqool.nl/?p=822