Information overload

Research suggests that, collectively,
employees of the average Times 1000 company spend more than 600,000 person-hours each year
simply trying to access information. Another study, by Scribe Technologies, translates
this into financial terms, finding that UK companies alone lose around £4.7 billion a
year because their employees cannot obtain the information they require in a timely and
efficient way.

A further issue is whether this information
is even available in a suitable form in the first place. In many organizations, a
significant percentage of staff spend their entire time re-keying data for use in
spreadsheets or other applications that support the business. By any standards, this
constitutes an inappropriate use of skilled and costly resources.

Against this background, it’s hardly
surprising that organizations will welcome any measure that promises a dramatic fall in
the costs of obtaining the correct information when it is needed. Recent advances in IT
have made it far easier for employees to make maximum use of the wealth of information
that already exists within their organizations.

Data rich but information poor

For some years, IT departments have
concentrated on providing efficient and fully functional transaction processing systems.
As a result, many have built large databases that could prove immensely useful to the
organizations concerned if only they could access the data effectively. Thanks to advances
in software and hardware, the concept of business intelligence for the masses is now a
reality.


Business Intelligence (BI) is a term coined
by industry analysts the Gartner Group. It is used to describe the analysis of information
from databases in order to improve the decision-making process.

Just a few years ago, most organizations
had relatively few PCs. Their specification was laughably basic by today’s standards,
with 386/486 processors, 250MB hard drives and 4MB or 8MB of memory being the norm. Now an
entry-level PC is likely to boast a 266MHz Pentium II processor, a minimum of 32MB of
memory and a 4GB hard drive. Couple this with the latest Graphical User Interfaces (GUIs),
networking capabilities, server technology, e-mail and the internet, and all the hardware
ingredients required for a successful BI implementation are within reach of every desktop.

However, the hardware on your desk isn’t
the only factor. Before purchasing any BI software, it’s important to understand the
various types on offer. For example, does your organization require report writing,
On-Line Analytical Processing (OLAP) analysis, data mining, or an integrated suite that
combines all of these and more?

Reporting on the business

Traditionally, report writing has been the
most popular method of producing information for an organization. With today’s
sophisticated reporting tools, the report writer has become far more than just a printing
mechanism.

The ability to deliver a report directly to
a computer screen is the most obvious improvement in modern-day report writers. However,
these tools now also support e-mail delivery and web reporting for internet and intranet
deployment, and – probably the most useful benefit – the ability to link reports
together intelligently, enabling exception reporting.

Summarized reports can be produced, with
the user able to simply click on an item of interest in order to execute a more detailed
report on that subject. Technology such as this eliminates the need to produce thousands
of pages of paper, of which only very few may be of real importance to the business. A
valuable benefit is therefore the reduced computing load on the transaction processing
system.
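The exception-plus-drill-down pattern described above can be illustrated with a minimal Python sketch. All figures, names and the threshold are invented for the example; a real report writer would query the source database rather than an in-memory dictionary.

```python
# Hypothetical sales figures by region and product (invented for illustration).
SALES = {
    "North": {"widgets": 120_000, "gadgets": 8_000},
    "South": {"widgets": 95_000, "gadgets": 110_000},
}

def summary_report(threshold):
    """Summarize each region, flagging exceptions that fall below the threshold."""
    report = {}
    for region, products in SALES.items():
        total = sum(products.values())
        report[region] = {"total": total, "exception": total < threshold}
    return report

def drill_down(region):
    """Detailed report for one region -- run only when the user clicks through."""
    return dict(SALES[region])

overview = summary_report(threshold=150_000)
for region, row in overview.items():
    if row["exception"]:
        detail = drill_down(region)  # fetch detail only for the exception
```

Only the summary is computed up front; the detailed query runs solely for flagged items, which is what keeps the load on the transaction processing system down.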

On-Line Analytical Processing
(OLAP)

The real world is not two-dimensional,
which is why OLAP systems have become increasingly popular to provide users with a
multidimensional approach to their data analysis. Most people are familiar with the
two-dimensional format of spreadsheets – or possibly even three dimensions, when
worksheets are used. In reality, though, people almost certainly need to analyze data in
more than three dimensions, to provide a ‘real world’ view that enables data to
be explored and analyzed from an enormous number of perspectives. This is where OLAP tools
come to the fore.

Most OLAP tools provide a very powerful GUI
that enables people to ‘slice and dice’ their data to perform complex analysis
on specific areas of interest. OLAP data is normally extracted from a transaction
processing system or centralized repositories of data known as data warehouses, and can
therefore exist in several formats according to the volume of data involved. The data is
stored in a multidimensional database (MDDB), otherwise known as an OLAP cube.
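A multidimensional cube and the ‘slice and dice’ operations it supports can be sketched in a few lines of Python. The dimensions, figures and function names below are invented for illustration; commercial OLAP engines store and index cubes far more efficiently.

```python
# Toy three-dimensional cube: (product, region, quarter) -> sales.
# All figures are invented.
cube = {
    ("widgets", "UK", "Q1"): 10, ("widgets", "UK", "Q2"): 12,
    ("widgets", "FR", "Q1"): 7,  ("gadgets", "UK", "Q1"): 5,
    ("gadgets", "FR", "Q2"): 9,
}

def slice_cube(cube, axis, value):
    """Fix one dimension (a 'slice'): axis 0=product, 1=region, 2=quarter."""
    return {k: v for k, v in cube.items() if k[axis] == value}

def rollup(cube, axis):
    """Aggregate one dimension away -- the classic OLAP roll-up."""
    out = {}
    for key, value in cube.items():
        reduced = key[:axis] + key[axis + 1:]
        out[reduced] = out.get(reduced, 0) + value
    return out

uk_cells = slice_cube(cube, 1, "UK")   # every cell for the UK
by_region = rollup(cube, 0)            # sales per (region, quarter), all products
```

Each extra dimension multiplies the number of possible views, which is why a GUI that lets users pick the slice interactively is so central to OLAP tools.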

Desktop OLAP (DOLAP) applications give
people with small data volumes the ability to store data locally on their PC; this is
often called Multidimensional OLAP, or MOLAP. More scalable MOLAP solutions allow data to
be stored on an NT or Unix server, with the user’s PC acting as a client, or access
medium.

Computer server technology enables OLAP
models to be widely shared among many users across a network. The processing power of the
server performs the ‘number crunching’ of the data, enabling users to have much
lower specification desktop PCs – these now become the viewers of the data, rather than
the processors. Such an approach helps to fully utilize the capability and capacity of a
networked computer infrastructure.

In very large OLAP installations, such as
those that administer loyalty and reward schemes in retail organizations, companies may
choose to deploy a data warehouse. A more scalable Relational OLAP (ROLAP) tool may then
be implemented above the data warehouse to extract key information that can make a real
impact on the business.

Balancing the business

One of the latest initiatives in BI,
balanced scorecards allow an organization to define specific business measures, which can
be compared and consolidated to gain a high level overview of overall business
performance. A well-designed scorecard system will incorporate a high-quality GUI and
provide a ‘drill-down’ capability to focus on trends and to analyze problem
areas within the business.
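The consolidation a balanced scorecard performs can be shown with a small Python sketch. The perspectives, measures and weights are entirely invented; real scorecard products let the business define these itself.

```python
# Invented perspectives, measures (scored 0..1) and weights.
SCORECARD = {
    "financial":  {"weight": 0.4, "measures": {"revenue_vs_plan": 0.9, "margin": 0.7}},
    "customer":   {"weight": 0.3, "measures": {"satisfaction": 0.8}},
    "operations": {"weight": 0.3, "measures": {"on_time_delivery": 0.6}},
}

def perspective_score(perspective):
    """Average the measures within one perspective."""
    measures = perspective["measures"]
    return sum(measures.values()) / len(measures)

def overall_score(card):
    """Consolidate weighted perspective scores into one headline figure."""
    return sum(p["weight"] * perspective_score(p) for p in card.values())

def drill_down(card, name):
    """Expose the individual measures behind a headline score."""
    return card[name]["measures"]
```

The headline figure gives the high-level overview; `drill_down` is the hook for focusing on a problem area.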

Mining your data

When handling large volumes of data, trends
and patterns can exist that are not immediately apparent when performing a manual
analysis. Data mining applications not only produce information that the user may have
requested specifically, but can be trained to identify obscure patterns and relationships
by mimicking human reasoning.
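One simple example of a pattern a mining tool surfaces automatically is item co-occurrence in transaction data. The Python sketch below, with invented baskets, counts frequent item pairs – the first step of association-rule mining, one common data mining technique.

```python
from itertools import combinations
from collections import Counter

# Invented market-basket transactions.
BASKETS = [
    {"bread", "milk"}, {"bread", "butter"}, {"milk", "butter"},
    {"bread", "milk", "butter"}, {"bread", "milk"},
]

def frequent_pairs(baskets, min_support):
    """Count item pairs across baskets and keep those meeting min_support.

    No user asked 'do bread and milk sell together?' -- the pattern
    emerges from the data itself.
    """
    counts = Counter()
    for basket in baskets:
        for pair in combinations(sorted(basket), 2):
            counts[pair] += 1
    return {pair: count for pair, count in counts.items() if count >= min_support}
```

On these baskets, only the pairing of bread and milk clears a support threshold of three – exactly the sort of relationship that would not be obvious from a manual scan of thousands of rows.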

Data warehousing

In certain circumstances, it may be
advisable to deploy BI tools in a layer above a data warehouse. A data warehouse is a
repository of data that is created from the ‘day-to-day’ transaction processing
systems, and which is kept separate from them in order to ensure the integrity of the
operational data. Typically, a data warehouse contains information consolidated from a
variety of data sources, such as head office systems from different countries and external
industry analysis.

Data warehouses are often held on a
different hardware platform and in a different database that is better suited to the
requirements of warehousing. One of the trends in data warehousing is the growing use of a
Windows NT server as the host platform. Most database vendors provide NT versions of their
databases – for example, Oracle, IBM with DB2 Universal Database (UDB) and Microsoft
with SQL Server. Windows NT also provides a seamless platform for graphical BI analysis
tools, as most are Windows-based.

In most cases, data within the warehouse
will have been redesigned, ‘cleansed’ (for example, duplicates having been
eradicated) and aggregated to make it easier to use BI tools. A data warehouse also helps
to reduce the computing load on the main transaction processing system.
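The cleansing and aggregation steps can be illustrated with a short Python sketch. The rows, keys and totals are invented; in practice this work is done by dedicated extract, transform and load tooling feeding the warehouse.

```python
# Invented operational rows, duplicated and unaggregated, as they might
# arrive from a transaction processing system.
RAW_ROWS = [
    {"order_id": 1, "region": "UK", "amount": 100.0},
    {"order_id": 1, "region": "UK", "amount": 100.0},   # duplicate
    {"order_id": 2, "region": "UK", "amount": 50.0},
    {"order_id": 3, "region": "FR", "amount": 75.0},
]

def cleanse(rows):
    """Eradicate duplicates, keyed here on order_id."""
    seen, clean = set(), []
    for row in rows:
        if row["order_id"] not in seen:
            seen.add(row["order_id"])
            clean.append(row)
    return clean

def aggregate(rows):
    """Aggregate by region -- the summarized form loaded into the warehouse."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

warehouse = aggregate(cleanse(RAW_ROWS))
```

Because the BI tools then query this pre-summarized copy, the operational system is untouched by analytical workloads.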

Deploying BI applications

Once you have identified which types of BI
application your organization requires, you can plan the deployment of the solution. Here
are some key factors to consider. Which users will have direct access to the source
database? Which users will be permitted only to view the data? Are you going to load BI
tools on to each user’s desktop PC? Are you going to provide access to the data via
the internet/intranet using a web browser? It’s important to understand the
implications and ongoing costs of maintaining each of these options before choosing a
suite of tools.

At a high level, small organizations may
not have the infrastructure or technical knowledge to set up an intranet, and desktop
tools will be more appropriate. Large organizations, however, may select the intranet
option, as it is more cost-effective to have a single, central software repository from
which users update their PCs, rather than keeping hundreds of individual PCs updated with
the latest releases of multiple software applications.

The internet and intranets

Until recently, the enterprise-wide
deployment of BI solutions was a problem for many organizations. Not only has the cost of
ownership been very high, but keeping every user’s PC upgraded to the latest releases of
hardware and software carries significant administrative overhead. In addition, data
extracted from source systems has to be transferred to multiple PCs before users can
analyze it.

However, everyone with a Windows 95/98/NT
desktop PC now has access to the Microsoft Internet Explorer web browser, which provides a
means of accessing web sites. The Netscape Navigator product is a similar browser
solution. Therefore, as BI software becomes increasingly web-enabled, people actually need
very little software on their PC. They can operate via a web browser, downloading
additional components such as Java applets or ActiveX controls as required, and access
data and software through a web server.

Some managers are concerned at the ease
with which information can now be accessed over the internet. However, in reality, this is
rarely an issue. Most solutions of this type are deployed over an intranet using an
internal network of PCs, while internet installations are usually implemented on a secure
web server.

Metadata: data about data

When deploying BI solutions, one of the
biggest tasks is the development of metadata above the source applications. All BI tools
need to understand the underlying database structure of their source system. The
definition of this database structure is known as metadata, although most BI tools have
proprietary terminology such as ‘knowledge bases’, ‘catalogs’ and
‘universes’. The need to provide metadata has resulted in the creation of more
‘shelfware’ than in any other area of BI implementations.

When choosing a set of BI tools, it’s
extremely useful if the metadata already exists between the tools and your source
applications. If it doesn’t, you will probably have to develop it yourself – normally
a lengthy process that requires detailed knowledge of the source application’s database.
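The role of a metadata layer can be sketched in a few lines of Python: a mapping from business-friendly terms to the physical schema, which a BI tool uses to generate queries. All table, column and term names here are invented.

```python
# A toy metadata layer ('catalog'/'universe' in vendor terminology) mapping
# business terms onto the physical schema of a hypothetical source application.
METADATA = {
    "Customer Name": {"table": "CUST_MASTER", "column": "CM_NAME"},
    "Order Value":   {"table": "ORD_LINES",   "column": "OL_NET_AMT"},
}

def to_sql(business_terms):
    """Translate business terms into the SQL a BI tool would generate."""
    cols = [f'{METADATA[t]["table"]}.{METADATA[t]["column"]}' for t in business_terms]
    tables = sorted({METADATA[t]["table"] for t in business_terms})
    return f'SELECT {", ".join(cols)} FROM {", ".join(tables)}'
```

Every business term a user might query needs an entry like this, which is why building the mapping by hand for a large source application is such a lengthy undertaking.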

Gaining an edge

There has never been a better time to reap
the benefits of Business Intelligence solutions. As projects related to year 2000 (Y2K)
compliance near completion, you have an excellent opportunity to deploy solutions that
make cost-effective use of the extensive data held in your Y2K-compliant applications.

Computer hardware now has the processing
power to handle the data required, and a PC/server/network infrastructure is in place in
most organizations. Coupled with the right choice of BI application, all of the key
components are in place to help you stay ahead of your competitors by turning raw data
into priceless, business-critical information. With far more efficient use of
employees’ time into the bargain, significant financial savings are an added bonus.

Mark White, JBA

Mark White has 11 years’ experience in
the IT industry, with a particular focus on the financial sector. He has spent the last
four years consulting on and implementing Business Intelligence solutions.