How to Fix 3G and Phone Issues on Mobily for HTC Devices

When using certain HTC devices on Mobily (Saudi Arabia), users cannot send or receive phone calls or SMS while 3G is active; incoming calls usually show up only as missed-call notifications.

Phone calls and SMS work only when 3G is turned off, for two reasons: Mobily does not automatically set the device's baud rate, and most HTC devices, like the Kaiser, come with 3.6 Mbps HSDPA, which Mobily doesn't recognize.

In order to use both Telephony and 3G data services simultaneously, do the following:

1. Under Advanced Phone Settings, disable both HSDPA and HSUPA. Go to the CSD Line Type tab, choose Data Rate: 38400 bps (V.110) or later, and Connection Element: Non_Transparent.

2. Under Phone Settings, go to the Band tab, choose Network: WCDMA, and select the band for GSM/UMTS: UMTS (2100) + GSM (900+1800).

You should now be able to enjoy simultaneous telephony and 3G on HTC devices using Mobily.

Business Semantic Layers in BusinessObjects XI 3.1

A Business Semantic Layer in a Business Intelligence platform is a data layer which consolidates data for immediate business consumption. It allows for a standardized set of data definitions and business terms across the enterprise. It also allows business users to tap directly into their data sources without dealing with the complexities of data and database management.

Although not truly semantic (yet), the business semantic layer provides an interface for business users to create their own reporting and analytics. The business semantic layer is usually developed by the data and information management teams or IT, and is usually coupled with data privacy and organizational security policies.

BusinessObjects provides two business semantic layers: Business Views (developed through the Business View Manager client tool) and the Universe (developed through the Designer client tool). The former comes from the acquisition of Crystal Decisions, while the latter is BusinessObjects' own home-grown layer. BusinessObjects has touted a unified semantic layer on several occasions; however, it hasn't arrived just yet.

The two layers overlap in much of their functionality; however, each serves specific purposes. If you plan to use Web Intelligence only, you are left with no choice but to use Universes. If you are using Crystal Reports, however, you have a fair deal of decision making to do before you can finally settle on Business Views, Universes, or maybe even both.

Here are some pros and cons of both approaches for your decision making:

Secure Distribution:

Universes allow row-level security, meaning users can define row-level filtering based on group and user policies. However, when scheduling reports against an enterprise group security matrix, a separate report has to be scheduled for every distinct row-level security configuration across the groups (or users).

This isn't a very efficient approach, and when scaling out it creates far more instances than Business Views do. Business Views use view-level security, meaning users can define a business view for each unique group (or user). The advantage is that a single report instance serves each unique group (or user) with distinct data security.
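
For illustration, here is a minimal sketch of how a row-level restriction might be expressed in Universe Designer. The @Variable('BOUSER') function is standard Designer syntax for the logged-in user's name; the table and column names are hypothetical:

    /* Hypothetical mapping table REGION_ACCESS links usernames to regions,
       so each user sees only the SALES_FACT rows mapped to them. */
    SALES_FACT.REGION_ID IN (
      SELECT REGION_ID
      FROM   REGION_ACCESS
      WHERE  USERNAME = @Variable('BOUSER')
    )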

Aggregate Awareness:

Aggregate Awareness is the ability to understand dimensional hierarchies when rolling up aggregates. The capability to define dimensional hierarchies is present only in Universes. Business Views are more like flat structures, with no real differentiation between objects in the structure. Unlike Universes, Business Views have no Measures, Details, Dimensions, or Classes, and no support for hierarchical relationships. Aggregate functions are therefore not optimized for roll-up.

This means that given three functions, sum(week), sum(month), and sum(year), a Universe reuses the aggregate values from lower grains and rolls them up to higher ones, whereas Business Views treat them all as independent of one another.
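
In a Universe, this reuse of pre-aggregated grains is typically declared with the @Aggregate_Aware function, whose arguments run from the most aggregated table to the least; the engine then picks the first one compatible with the dimensions used in the query. A minimal sketch, with hypothetical table names:

    -- The engine substitutes AGG_YEAR when the query needs only yearly
    -- totals, falling back to AGG_MONTH and finally the detail table.
    @Aggregate_Aware(
      sum(AGG_YEAR.SALES),
      sum(AGG_MONTH.SALES),
      sum(SALES_FACT.SALES)
    )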

Business Contexts:

As with aggregate awareness, Business Views also lack the Query Contexts found in Universes. This means that where multiple join paths exist between tables, Business Views provide no information to the SQL optimizer for choosing the query path, which leads to poorer performance. On the other hand, Business Views seldom run into loops, thanks to the separation of layers for Business Elements.

Limitations of Using Universes with Crystal Reports:

Although Crystal Reports can read from both Business Views and Universes, there are known limitations to using Universes as a data source for Crystal Reports.

Check out the limitations here

My recommendation between Business Views and Universes comes down to the basic usage of the two reporting products.

For operational reporting with limited metrics (in terms of aggregations, etc.) but a greater focus on information distribution, Business Views are preferred.

For management and analytical reporting needs, with a relatively greater number of metrics and fewer security groupings, Universes are better.

Ultimately, though, it all depends on the exact needs of the customer from the platform and the tools to really decide which way to go.

The future is a unified semantic layer from BusinessObjects…

Xcelsius Connectivity – FlyNet Services

This is part of a mini-series on the data connectivity aspects of the Crystal Xcelsius dashboard tool, part of SAP BusinessObjects.

Xcelsius dashboards are Shockwave (SWF) files providing rich user interactions and visualizations. The product is a specialized RAD tool on top of Adobe Flash which focuses exclusively on building dashboards. It comes with a set of key dashboard visualization widgets, including charts, graphs, trends, signals, and gauges, and it is extensible through Adobe Flex.

Xcelsius dashboards can be deployed in various ways and have several usage scenarios, typically with both individuals and large groups consuming a particular dashboard. One of the most important areas is the product's data connectivity features.

This mini-series talks in some detail about the various connectivity mechanisms. Here I will talk about FlyNet Services, a third-party component which comes free, as an express edition, with Xcelsius Engage and Enterprise licensed media.

FlyNet is simply a .NET web services generator (WSDL + ASMX); building such services is not usually a popular skill set among dashboard designers. WSDL, the Web Services Description Language, is an XML format for describing the functionality exposed by distributed services. Xcelsius dashboards can communicate back and forth with data stores through a web service interface. One can write custom code for communicating between the Xcelsius engine and the data stores using .NET, Java, PHP, etc., but that requires dedicated developer skills.

FlyNet services provide a handy tool to automatically generate web service code for deployment to a web server, usually Microsoft IIS (though it can be deployed on an Apache web server as well).

If you have already purchased Xcelsius Engage or Enterprise, the Flynet setup will be in the folder <Xcelsius Install Path>\AddOns\Flynet or <Xcelsius Install Path>\Connectivity\FlyNet

After installing the FlyNet Web Service Generator and IIS:

Enter the license key both in Xcelsius (Help -> Update License Key) and in the FlyNet Web Services Generator (Help -> Enter New Product Key).

However, if you enter an Xcelsius Enterprise license, you will get an "Enterprise License Issue" error message.

Make sure you have the Crystal Xcelsius Engage Server license; this particular Xcelsius version connects directly with data sources.

The Enterprise license only works for data connectivity with BusinessObjects Enterprise, not directly with live data sources.

The FlyNet Generator has a catch: it only allows as many 'analytics' updates as the number of CALs licensed. That is, for a 10-CAL Engage Server license, FlyNet services will randomly update only 10 analytic widgets (components) per deployment. It is therefore not a viable solution for dashboards requiring data connectivity for more analytic widgets.

If this doesn't suffice, your options are Adobe LiveCycle Data Services (LCDS) or hand-coded web services…

The FlyNet WS Generator is a simple three-tab, and thus three-step, tool.

1. Web Service

Enter a name (preferably without spaces, since certain web servers don't handle white space in URLs very well), a description, and a folder location where you want your generated web services to be placed. If you plan to use IIS and deploy the web services using FlyNet, make sure the folder is under the right security domain on the production system. Otherwise, you can generate the web service to any folder and then manually deploy it as a web app in IIS by unchecking the option "Register Web Service with IIS".

If you have a corporate web service deployment policy, or you simply want to organize your deployed web services better, you can use the advanced settings to specify a .NET namespace for your generated web service code as well as the WS namespace URL.

Once you have filled in this basic information, you can move to the next tab: "Data Source Connection".

2. Data Source Connection

This is where the beauty of this utility lies. You just define your query in SQL (with slight variations) and you get yourself a generated web service.

Click on New Data Source to define either an OLE DB or an ODBC connection. FlyNet provides data adapters for numerous data sources, including RDBMSs, OLAP cubes, and even Excel files directly. However, it is somewhat strange to use web services to connect to Excel through FlyNet, since Xcelsius already provides XML mapping via Excel!

You can define the connection string either through the wizard or directly using the Advanced button. Once completed, you can test your connection string and view the summary.
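
For reference, a typical OLE DB connection string for SQL Server looks like the line below; the server and database names are placeholders for your own environment:

    Provider=SQLOLEDB;Data Source=MYSERVER;Initial Catalog=SalesDW;Integrated Security=SSPI;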

3. Queries:

The best way to define the queries is through the query wizard. The reason is that FlyNet, for unknown reasons, did not write a full ANSI SQL-92 (or newer) parser. For aggregates, stored procedures, and CASE statements, there are special provisions.

For aggregate values, place a ~ before and after the alias. For CASE statements, enclose them in parentheses. For stored procedures, use the EXEC keyword.
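
Putting those provisions together, a FlyNet query might look like the following sketch; the table, columns, and stored procedure here are hypothetical:

    -- Aggregate alias wrapped in ~ markers; CASE enclosed in parentheses
    SELECT Region,
           SUM(Sales) AS ~TotalSales~,
           (CASE WHEN SUM(Sales) > 100000 THEN 'High' ELSE 'Low' END) AS Tier
    FROM   SalesFact
    GROUP BY Region

    -- A stored procedure is invoked with the EXEC keyword
    EXEC GetRegionSales 2009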

Generate Web Service

That's it! Generate the web service once you are done with your queries. Each query will lead to a new variable in Xcelsius; I'll come to that later.

The utility then generates a WSDL file, the associated ASMX file, and the web.config file for use. Pity it does not reveal the C# code…

If you selected "Register Web Service with IIS", you will get a dialog asking whether you want to view your web service, which will open its URL in your browser.

Otherwise you can deploy the three files in IIS or Apache manually.

You can test your web service using the "Invoke" button. You should see the generated XML with the relevant data in it. If things go wrong, the web service will emit the errors inside the data elements of the XML.
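
The response is plain XML whose element names depend on your query; purely as a hypothetical illustration, a successful invocation might return something shaped like this:

    <?xml version="1.0" encoding="utf-8"?>
    <!-- Hypothetical response shape; element names come from your query -->
    <GetRegionSalesResponse xmlns="http://tempuri.org/">
      <Row>
        <Region>East</Region>
        <TotalSales>120000</TotalSales>
      </Row>
    </GetRegionSalesResponse>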

Xcelsius

Now let's hook the Xcelsius dashboard up to the web service.

Go to the "Data Manager", accessible from the toolbar, and click Add -> Add a Web Service Connection.

Enter the WSDL URL and click Import. This will import all the parameters into Xcelsius for linking purposes.

Notice that each query you defined in FlyNet has been converted into a method here. This way you can build multiple input and output parameters, and it also helps in running queries in parallel. How you distribute the data to input and fetch across multiple queries is a pure design decision, involving factors such as runtime performance, database performance, and dashboard update frequency.

You can now link the webservice with the embedded Excel model by linking cells to parameters.

Once you are done with this, you can design your dashboard interface the usual way, linking cells to the various widgets. Since these cells are now linked to web service parameters, your dashboard will change dynamically based on the source data!

Note: NullPointerException when accessing Dashboard and Analytics Setup (BOXI 3.0)

When using the Dashboard and Analytics setup for the first time, you might run into this stack trace:

java.lang.NullPointerException
    at com.bo.aa.util.SecurityStore.GetLogonToken(SecurityStore.java:28)
    at com.bo.aa.util.SecurityStore.getSecurityValue(SecurityStore.java:79)
    at com.bo.aa.impl.DBServerImpl.getSecurityToken(DBServerImpl.java:166)
    at org.apache.jsp.jsp.appsHome_jsp._jspService(appsHome_jsp.java:536)
    at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:97)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
    at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:334)
    at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:314)
    at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:264)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:252)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:178)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:126)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:105)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:107)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:148)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:869)
    at org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:664)
    at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:527)
    at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:80)
    at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:684)
    at java.lang.Thread.run(Thread.java:595)

Reason:
This happens because Performance Management needs a separate user account to function.

This is set in the file //businessobjects/performance management 12.0/initConfig.properties

Solution:
1. Create a new user with no password, belonging to the administrator group, and update initConfig.properties with this information (see the sketch below).

2. Within the Central Configuration Manager, restart both Apache Tomcat and the Server Intelligence Agent.

3. Log in to InfoView using the new credentials, and voilà!
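
As a rough sketch, the relevant initConfig.properties entries would look something like the lines below. The key names here are assumptions from memory, so match them against the keys already present in your file:

    # Hypothetical key names - check your initConfig.properties for the real ones
    username=pm_admin_user
    password=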

Data Requirement for Advanced Analytics

TDWI author Philip Russom has presented a fantastic checklist of the data requirements for advanced analytics.

First, it comes from a major BI/DW organization and pinpoints the need for different data architectures for reporting and for analytics (particularly advanced analytics).

Second, it serves as an important document for data warehousing and modeling experts, who usually don't consider advanced analytics usage when designing data storage.

Third, it promotes the provisioning of the separate analytical data stores that advanced analytics demand.

Fourth, it serves as a business case for in-memory databases.

Standard reporting and analytics (OLAP) are well served by multidimensional models (high-level, summarized data), while advanced analytics require raw transactional data (low-level, detailed data) along with aggregated and derived data, usually in denormalized form. The exact nature of the design is determined by the type of analysis to be carried out.

Data integration also differs between data warehouses serving reporting and OLAP and the analytics databases serving advanced analytics. The former mostly rely on ETL, while the latter are better served, both in practicality and in the nature of the analysis, by ELT.

Secondly, data integration for data warehousing deals mostly with aggregating, consolidating, and changing the schema type from relational to multidimensional, whereas in an analytics database the data integration is of a more mathematical nature, with activities like discretization of continuous data, binning, reverse pivoting, data sampling, and PCA heavily employed.
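
As a small illustration of this ELT-style preparation done inside the analytics database, binning a continuous measure into deciles can be expressed in plain SQL with a window function; the table and columns are hypothetical:

    -- Assign each customer to a spend decile (1 = lowest, 10 = highest)
    SELECT customer_id,
           total_spend,
           NTILE(10) OVER (ORDER BY total_spend) AS spend_decile
    FROM   customer_spend;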

A similar discussion was carried out some time ago here.

This white paper makes a strong case.

BI Competency Centers – A Program Management Approach Towards Delivering Excellent BI

I recently came across a client whose genuine needs were overwhelmed by a bit of frustration and a lack of solution partners who are more than just technology implementers. The company had accumulated a lot of software solutions from various vendors, with overlapping features and functionality, and had multiple departments in a tug of war to take their departmental (and individual) BI ventures enterprise-wide. Interestingly, there was the case of an orphaned BI initiative there as well; the sponsors of that project had gone elsewhere!

The company came up with the idea of an excellence center (or a competency center) to try to standardize people, processes, and technology. Easier said than done: the whole concept IS highly utopian and is usually touted as a single solution to this fairly universal problem. But achieving this "excellence" requires a lot of background work, which besides being costly is also time-consuming.

To start with, a vision for the excellence center has to be developed. First of all, what DO THESE TERMS REALLY MEAN? Excellence centers, competency centers, strategy and delivery groups, and so on. It is one of the curses of hype, but a fairly reasonable mapping exists: "Program Management".

A Business Intelligence program management approach, which sees BI implementations not just as a technology with limited business benefits but as a business-initiated venture with a targeted growth plan providing further services, features, ROI, and sanity, has a very strong case to sell.

According to Gartner Research, “A BICC is a cross-functional team with specific tasks, roles, responsibilities, and processes for supporting and promoting the effective use of Business Intelligence across the organization.”

BI projects should be looked at as ongoing, cyclical, and iterative BI processes providing improved delivery at each iteration. A competency center can provide the framework for measuring BI projects and their implementation; it also lets the company experience the cultural and operational transformations taking place as a result of a systematic and pervasive BI establishment. However, given the different organizational behaviors of different-sized companies, operating in various verticals and diverse cultural backgrounds, this cannot be a single, one-size-fits-all offering.

It has to be tailored for each concern, whether a corporation or a department. In general, though, a small set of services are considered core to the BI concerns of a company, namely:

  1. The Periodic Assessment of ROI and Cost vs Benefits.
  2. The standardization of processes and technology, which includes an enterprise-level integration infrastructure, again for both business and technology.
  3. A well defined and controlled Risk Management perspective on the BI space.
  4. A carefully crafted Knowledge Management initiative including organizational change.
  5. A focused and prioritized agenda on Business User “Buy-In” into the BI environment.

Several companies provide BICC setup and operations competencies and consultancies these days. However, there aren't many best practices or guidelines for choosing the right partner to establish one. Minimum requirements would be the ability to execute BI projects and programs, plus strong human resources, business process, and systems integration skills.

Although BICCs are ongoing programs, they should be highly target-oriented. These milestones and performance targets are based on various assessment calculators, which usually come as part of a BICC setup.

A very creative way to visualize the progress and understand the whole philosophy behind BICCs is the wonderful BI Maturity Model, developed by TDWI, for demonstrating the characteristics of a BI program or project.

There is also a fairly detailed book on the topic of establishing and developing a BICC, published by SAS and Wiley, titled:

“Business Intelligence Competency Centers: A Team Approach to Maximizing Competitive Advantage” (Wiley and SAS Business Series) by Gloria J. Miller, Dagmar Brautigam, and Stefanie V. Gerlach.

Although the book was written by one of the BICC consultancy firms, the ideas presented are universally applicable. Their interpretation of the core services offered (or that should be offered) by a BICC has been widely adopted by both industry and academia.

Source: “Business Intelligence Competency Centers: A Team Approach to Maximizing Competitive Advantage”, SAS and Wiley.

All of these services are interrelated, and each serves as an input to the others. Each service also serves more than one goal of the BICC.

For example, the advanced analytics service, besides making BI and its infrastructure more widely usable, also increases ROI. It also presents a strong case for evangelizing BI, giving business users an insight into what CAN come out of your BI environment. For organizations without a sound infrastructure in place, an aggressively advertised advanced analytics service can form the motive to invest in a holistic enterprise information architecture, for example.

Establishing a BICC is a highly subjective matter and varies substantially from case to case. However, a template-based road map can be followed, such as the one provided in the book referenced above. Primarily it depends on whether a similar setup already exists in the company, the maturity of the company's processes and policies for change management and technology, the people in terms of domain expertise and skill levels, the budget and time constraints, and so on.

As a general best practice, it is ideal to grow the BICC organically, meaning from the bottom up with sponsorship from the top: a BICC prototype planned for the enterprise but serving one smaller concern at a time, such as a department, then gradually growing to cover more departments and offer richer services.

Having a centralized approach to managing the concerns of BI is a daunting task, but it promises dividends if done well. The success of BI projects relies heavily on their continuity, reliability, flexibility, visibility, and scalability. BICCs offer just that.

The TDWI BI Maturity Model

As most industry experts tout the ongoing nature of Business Intelligence projects, there comes a natural desire to rate the status quo of BI in an organization. However, as wide and diverse as a Business Intelligence implementation is in terms of the tools, processes, people, and culture involved, it does need an overall benchmark to assess future directions, by setting goals and understanding the shortcomings of the current offering. To measure the 'maturity' of an implementation, several independent organizations and certain vendors have developed their own assessment models. Among the publicly available models, TDWI's BI Maturity Model is a top-down, vision-oriented model which organizations can use to develop a road map. The model is a generalization of multiple BI projects and implementations, indicating certain patterns of behavior along five different aspects:

1. BI Adoption

2. Organizational Control and Processes

3. Usage

4. Insight

5. ROI

Such a maturity assessment is important for gauging the value of the business intelligence initiative. For systems integrators and consultants it serves as a guide for setting project milestones, deliverables, and management; for C-level execs it is a step-by-step guide to the ROI from BI investments.

The TDWI BI Maturity Model offers a framework to adjudge the current standings of a BI implementation in terms of its adoption, control, usage, insights and finally the ROI.

An associated poster, taken from Timo Elliott's blog, is presented here:

Each graph has its own target audience among the stakeholders and serves as a guide to a particular agenda. Management might be most interested in the last graph, business value and ROI, whereas business analysts might be more interested in Insight, while the program implementers will focus on adoption and usage as their primary concerns.

The Gaps:

While advocating the value of Business Intelligence, it is imperative for organizations to understand the gaps that obstruct their progress, whether caused by management or by the prevailing culture of the organization. It is the crossing of these gaps that defines the change management agenda for the program.

In the BI adoption cycle, the gaps define points of stagnation which require consultancy and a self-motivated drive by the organization to cross. In the other cycles, the gaps indicate a paradigm shift, a state transition, and the time to reap the rewards of the investment. For local control versus enterprise standards, this indicates well-placed Business Process Management, occasionally coupled with Data Governance initiatives, taking over the ad hoc practices within the organization. At the first gap, called the gulf, it is individuals who feel empowered, mostly by self-service BI capabilities, but things tend to stop there unless the organization pushes on towards the second gap, called the chasm, where it finds a mutual agreement between the individual centers of control and the corporate governance and management practices. Crossing it leads to organizational enlightenment: becoming a 'sage'.

For BI usage, before the gulf the organization is equipped with its first batch of power users, who see in BI freedom from their IT department in getting insight. But here is where the gap occurs: a group of established power users, indirectly in control of the BI program, lets the organization stagnate and miss the ability to truly empower all of its business users, not just the power users. This may be an ideal, utopian state to aim for, but as the way of wise men goes, there is no end to gaining wisdom and becoming a sage. To overcome this chasm, organizations further enhance their incentive systems for business users while providing the next layer of BI services: customization capabilities to supplement self-service. Here the tools become pervasive among the organization's staff, and their throughput increases.

For BI insight, the phases between the chasms indicate the shortening of the decision-making process. Once the second and final chasm is crossed, organizations gain the capability of automated decision making and a system which supplements business users with true decision support.

Programs such as BI require dedicated sponsorship from management to prosper, and when management does take an interest, ROI expectations might supersede the actual values. This is partly caused by phenomena well explained in Gartner's Hype Cycle.

This poster helps BI teams better explain the ROI expectations to the management in charge of the programs. "As you sow, so shall you reap" applies to BI programs as well.

The model is well explained by its author, Wayne Eckerson, and certain blogs supplement the material as well.

Some notable links:

1. TDWI BI Maturity Assessment Tool

2. You can’t get there from here!

3. How's your BI Maturity?

Gartner 2009 Strategic Technologies

Although this is a slightly old post, here is an interesting article by Gartner on the 2009 Strategic Technologies. Revisiting this October 2008 vision a quarter later reveals its accuracy.

2009 began with a downturn economy and sales strategies focused on fixing things, a rather reactive approach, but in the world of information technology things are getting better: lots of convergence, huge strides in maturity, an increase in motivation, and an aggressive roadmap.

Gartner presents the top 10 strategic technologies, listed here:

Virtualization
Cloud Computing
Beyond Blade Servers
Green IT
Web Oriented Architectures
Enterprise Mashups 2.0
Specialised Systems
Social Software and Social Networking
Unified Communications
Business Intelligence

My reading of the list actually groups Gartner's choices into three streams:

Infrastructure
NextGen Applications
Business Intelligence

Infrastructure:
Virtualization is a massive stride forward in server consolidation and, to an extent, in lowering software licensing costs for organisations. As the use of virtualization within companies increases, so has the need for virtualization management and security. Enterprises already at a mature state of virtualization will focus further on integrating their virtual platforms with the existing physical infrastructure, on making virtualization more invisible on networks, and on capabilities to take snapshots for cloning physical server arrangements and configurations, including software.
A pain point for the year will be virtualization security, as the push to bring virtualization into mainstream environments will reveal aspects of security specific to this line of technology.

Cloud Computing will further SaaS models, and the business model's appeal will increase in emerging markets like the Middle East and the Far East. This is coupled with increased investment in telecommunication backbones and a greater awareness of outsourcing IT maintenance to service providers offering services in the cloud.
However, there are many pain points, ranging from rising telecom costs to sporadic skepticism about the TCO of the SaaS business model in markets like the Middle East. The pain points remain the height of excessive expectations and the lack of best practices for most organizations to adopt.

Green IT and the disappearance of blade servers can be duly served by the changing trends in software consumption behavior (from products to services), where service providers drive these initiatives. However, Green IT will not be adopted widely during this continued recession phase.

Next Gen Apps:
SOA is reaching its Plateau of Productivity on Gartner's Hype Cycle and will further enterprise mashups. This technology has already given a direction to evolving web standards, and the resulting architectures have influenced a wider range of application connectivity, leading to benefits harnessed by other technologies and applications, including social networking applications, collaboration, and business intelligence. With the emerging trend of converged networks, pervasive computing applications, including location-aware and embedded systems, will increase as well. Companies already at this level will continue to invest in application semantics using BPM automation, service orchestration, and semantic web services. Certain vendors have already started to bring forward their product offerings in these areas.

Business Intelligence:
It is prime time for BI to flourish. Already growing aggressively in 2007 and 2008 according to both Gartner and TDWI, BI makes its case through greater compliance, visibility, and transparency, something the post-recession period demands. This period also gives organisations the motivation to focus on their performance management. Business Intelligence has a wider portfolio to offer this year, with many vendors offering (or acquiring) data management appliances, including Teradata, Greenplum, Microsoft, etc. Last year's vendor consolidation has brought greater strength to the individual portfolio of each BI provider. SaaS models are also available as alternatives, giving customers much flexibility and even the possibility of maturing their BI initiatives. The cost of implementing Business Intelligence will go down for organisations with experience in consuming SaaS and for those ready to invest in open source BI, which has reached impressive maturity.
However, Predictive Analytics has yet to go mainstream this year, though it is a probable reality in the coming years.

Organizing Life 2.0 – A brief comparison

Jotting down the nitty-gritties, the mundane details, the crosses, the hashes, scrapping it all and going back to the drawing board: these are the usual activities of anyone taking notes and trying to bring structure to chaos. There are several theories and techniques out there to survive life 2.0, and many man-hours have been spent by many a man trying to figure out the system that suits him best. For all the sexists, let me be clear: I believe women are better organised and can manage multiple tasks. But men have to use one of the many available artificial systems to get back control of their lives. I managed to prune such systems down to three, close to nature.

The ThinkingRock software supports the GTD methodology, the FreeMind software supports mind maps, and MS OneNote supports, well… collaborative notes. TaskJuggler came a close fourth on personal taste, but I perceive it as too geeky for a general audience to catch the concept. All three of these applications, along with their methodologies, have individual strengths and weaknesses, which are subjective, depending on one's interests, educational and professional background, and capability of usage. Although all three are pretty intuitive and take no time to get going, there are several opposing communities of users whose preferences conflict with one another.

Here I will present my perspective on how I organise myself better, or just perceive myself to be better organised!

1. FreeMind (Free):

Mind maps have been used since as early as Aristotle as a way to represent the things immediate to mind. Psychologists say that, on average, our mind can hold 7±2 concepts at a particular time; you could say our cache can hold that many concepts. Some of us find ourselves stressed out by the burden of holding more than 9 items simultaneously, which results in stress, incorrect judgement, and inconsistent decisions. A mind map is a simple, intuitive way to organise the concepts immediate in our minds into a tree-like structure whose depth can be controlled depending on our context. Here is a typical mind map made in FreeMind, an open-source tool which provides richer features than any commercial mind-mapping tool on the market.


2. Microsoft OneNote:

Microsoft introduced OneNote as part of its Office suite in 2003, and while it gained popularity from Office 2007 onwards, due to tighter integration with Outlook and Word and to licensing and distribution changes (it now ships with the standard Office suite), it has yet to achieve regular membership in the Office family. OneNote's strong point is its real-time collaboration, which gives it a shared-whiteboard feel that can accommodate most media types: text, images, video, audio, Office objects (Visio shapes, Excel sheets, etc.), handwriting (on Tablet PCs), and good flexibility to use the writing area like a physical scrap pad. What it lacks, though, is a systematic structure for representing information, which can be good in some scenarios. Unlike FreeMind or ThinkingRock, which are backed by particular knowledge representation schemes, OneNote is for the free souls to use as they please.

This approach suits many individuals but can't be relied upon for team-based project sharing and collaboration. Although OneNote appears to present well-organized templates, it actually does little more than enter default bulleted "flat" text.

3. ThinkingRock (Free)

This is a very well made piece of software following David Allen's Getting Things Done (GTD) approach, whose main mantra is context. Our daily routines pass through different contexts, including our location, our mood, and our energy for different types of work at different times of the day. This ties into the philosophy, used in mind maps as well, that the fewer thoughts one holds at a particular time, the more creative and productive he/she can become.

ThinkingRock automatically hides all tasks and thoughts outside one's current context and provides a self-adaptive task priority utility, in which low-priority activities automatically become activities to complete ASAP after some time. This lets one eventually complete all tasks regardless of priority and not forget even the smaller things in life.

As a PIM (Personal Information Management) tool, ThinkingRock is a clear choice over the other two, but as a single point of reference for managing thoughts, scraps, and time, OneNote and FreeMind can be used instead. For teams working collaboratively, there is no comparison to the features offered by OneNote. In essence, to use the best of breed, one has to use at least two of these products simultaneously until integration between them is developed. There are already some collaborative features available in FreeMind, and its development is very active, which is a sign of better things to come. This gives FreeMind an edge over OneNote, while ThinkingRock can be used solely as a PIM.
As a PIM (Perosnal Information Management) tool, ThinkingRock is a clear choice over the other two but as a single point of reference for managing thoughts, scraps, and time, OneNote and FreeMind can be used instead. For teams working on collaborative work, there is no comparision to the features offered by OneNote. In essence, to use the best of breed, one has to use atleast two of these products simultaneously until their intergration is developed. There is already some collaborative features available on FreeMind and the development is very active which is a sign of better things to come. This gives an edge to FreeMind over OneNote, while ThinkingRock can be used solely as a PIM.