
Cognos Auditing – Tips and Tricks for Large & High Volume Environments
by John Boyer | May 17, 2021 | Cognos Analytics, Cognos Performance, Cognos Troubleshooting, ReportCard

A blog by John Boyer and Mike Norris.

Introduction
It is important to have the Cognos Auditing capability working so that you know and understand how Cognos is being used by your user community and can answer questions like:
Who is using the system?
What reports are they running?
What are the report run times?
With the help of other tools, like MotioCI, what content is unused?

Considering how critical it is to maintain healthy Cognos Analytics environments, surprisingly little has been written about its auditing database beyond the standard product documentation. Perhaps it’s taken for granted, but organizations that use it know that over time querying the Audit Database tables will start to slow – especially if your organization has a lot of users running lots of reports and has accumulated a lot of history. What’s more, the audit activity logging itself may be delayed because records are queued when they can’t be added to the database quickly enough. That’s when you start to think about database performance as you would with any operational database that has reporting requirements.

Large tables typically slow query performance. The larger the table, the longer it takes to insert into and to query. Remember that these tables and the Audit Database are basically an operational database: writes happen frequently and work against us, because we can’t optimize the tables for read-only operations as we would with a data mart.

Much like the content store, the health of the Cognos environment must also take into
account the health of the Audit Database. Unbounded growth of the Audit Database can
become an issue over time and may eventually even impact the overall performance of a
Cognos environment. In many organizations with external regulations thrust upon them, not
having a full audit record can land them in a non-compliance situation with heavy
repercussions. So how do we deal with having to maintain so much data for historical
auditing purposes — in some cases up to 10 years — yet still get the reporting we need to
maintain the environment and keep users happy with the performance?

The Challenge
Unbounded growth of the Audit Database is negatively impacting the health of the
Cognos environment
Reporting off the Audit Database has become slow or unusable
Cognos experiences delays in records being written to the Audit Database
The Audit Database is running out of disk space

All this means that it’s not just the reports that rely on the Audit Database that suffer, but often the entire system. If the Audit Database is on the same server as the Cognos content store, the performance of all things Cognos in that environment will be affected.

The Setup
We assume:

1. Cognos Analytics is installed and running


2. Cognos is configured to log to an Audit Database:
An Audit Database is in place
Appropriate audit logging levels are set in Cognos Administration
Records are being written to the database by Cognos

3. The Audit Database has been in use for more than a year
4. The environment is very active with users and executions
5. The Audit package is being used to surface Cognos usage data
6. We are looking to improve Audit Database reporting performance
7. Starting over or deleting old records is not always an option

If you don’t yet have Cognos Audit installed and configured, Lodestar Solutions, a Motio partner, has an excellent post on enabling Audit in Cognos BI/CA.

The Solution
There are some possible solutions that quickly present themselves:

1. Reduce the volume of data by:
Moving some of the older data to another database
Moving some of the older data to another table in the same database
2. Just delete or archive some of the data and don’t worry about it
3. Live with it. Kick the can down the road and push the Database Administrator for performance improvements while handcuffing them by not allowing alterations of the schema or indexes

We’re not going to deal with option 3. Option 2, deleting the data, is not a good option; I’d recommend keeping at least 18 months’ worth. But if you’re so inclined, IBM provides a utility, AuditDBCleanup (Cognos BI), or a script (Cognos Analytics), which will do exactly that. The utility for Cognos BI deletes records based on a timestamp, while the scripts for Cognos Analytics simply drop the indexes and tables.

The recommendation we’ve made to clients previously is to separate the audit data into two databases:

1. Audit – Live: contains the most recent week’s worth of data
2. Audit – Historical: contains historical data (up to N years)

In short, the process runs weekly to move the most recent records from Audit – Live to Audit – Historical. Audit – Live starts over as a blank slate after this process runs. The benefits:

1. The Live DB is fast and tight, allowing inserts to happen as fast as possible
2. Audit queries are exclusively directed to the Historical DB

Using this approach, there is no implicit “stitching together” of the Live data and the
Historical data. I would argue that you probably want to keep it that way.

In Cognos Administration, you can add two different connections for the Audit Data Source. When a user runs a report against the Audit package, they are prompted for which connection they want to use.

On the off chance you want to look at live audit data rather than historical audit data, you just pick the “Audit – Live” connection when prompted (this should be the exception, not the norm).

If you REALLY also want to provide a consolidated view of both Live and Historical, you could do so, but it would impact performance.

For example, you could create a third database called “Audit – Consolidated View” and then, for each table in the Audit schema, create an identically named view that is a SQL union between the table in the Live DB and the table in the Historical DB. Similarly, this could also be achieved in the Framework Manager model, but, again, performance would be a key consideration.
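
As a rough illustration of that union approach, here is a minimal sketch in Python that generates the view definitions. It assumes hypothetical database names AuditLive and AuditHistorical, SQL Server style three-part naming, and a standard DB-API connection (e.g. pyodbc) to the consolidated database; the table list and SQL dialect would need to be adapted to your environment.

```python
# Sketch: generate "consolidated" views that UNION ALL each audit table in the
# Live database with its counterpart in the Historical database.
# Assumptions (hypothetical): SQL Server style three-part names, databases
# named AuditLive / AuditHistorical, and a DB-API connection to the third
# database that will hold the views.

AUDIT_TABLES = [
    "COGIPF_USERLOGON",
    "COGIPF_RUNREPORT",
    "COGIPF_VIEWREPORT",
    "COGIPF_EDITQUERY",
    "COGIPF_RUNJOB",
    "COGIPF_ACTION",
    # extend with the remaining audit tables as needed
]

def create_consolidated_views(conn, live_db="AuditLive", hist_db="AuditHistorical"):
    """Create one UNION ALL view per audit table in the consolidated database."""
    cur = conn.cursor()
    for table in AUDIT_TABLES:
        cur.execute(f"""
            CREATE VIEW dbo.{table} AS
            SELECT * FROM {live_db}.dbo.{table}
            UNION ALL
            SELECT * FROM {hist_db}.dbo.{table}
        """)
    conn.commit()
```

Because each view keeps the original COGIPF_* name, the existing Audit Framework Manager model could be pointed at the consolidated database without remodeling, but every query then pays the cost of the union.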

Some of our clients have created a consolidated view. In our opinion, this is likely overkill: performance would always be worse in the consolidated view, and we haven’t come across many use cases that need both the Live and Historical data sets together; the Live data is used for troubleshooting and the Historical data for trend reporting.

As of Cognos Analytics 11.1.7, the Audit Database has grown to 21 tables. You can find more information elsewhere on the Audit Database, the sample audit reports, and the Framework Manager model. The default logging level is Minimal, but you may want to use the next level, Basic, to capture user requests, user account management, and runtime usage. One way you can maintain system performance is by keeping the logging level to the lowest level required. Obviously, the more logging the server does, the more overall server performance can be affected.

The key tables most administrators will be interested in are the 6 tables which log the user
activity and reporting activity in the system.

COGIPF_USERLOGON : Stores user logon (including log off) information


COGIPF_RUNREPORT : Stores information about report executions
COGIPF_VIEWREPORT : Stores information about report view requests
COGIPF_EDITQUERY : Stores information about query runs
COGIPF_RUNJOB : Stores information about job requests
COGIPF_ACTION : Records user actions in Cognos (this table may grow much more
rapidly than the others)

The out-of-the-box configuration looks like this:

Recommended configuration:

The Cognos Audit Database – Live contains one week of audit data. Data older than one week is moved to the Cognos Audit Database – Historical.

The line from the Cognos Audit Database – Live to the Cognos Audit Database – Historical in the diagram represents the weekly process responsible for (see the sketch after this list):

Copying data from Live Audit to Historical Audit
Removing all rows in Live Audit that are older than 1 week
Removing all rows in Historical Audit that are older than N years
Removing all rows in COGIPF_ACTION that are older than 6 months
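
Here is a minimal sketch of what that weekly job could look like. It assumes DB-API connections to both databases, a qmark-style parameter driver such as pyodbc, identical table definitions on both sides, and the COGIPF_LOCALTIMESTAMP column as the age criterion (verify that column against your schema); in practice this would more likely be a DBA-managed stored procedure or ETL job.

```python
# Sketch of the weekly Live -> Historical maintenance job described above.
# Assumptions (hypothetical): '?' placeholders (qmark paramstyle, e.g. pyodbc),
# identical column order in both databases, and COGIPF_LOCALTIMESTAMP present
# on every table listed.

from datetime import datetime, timedelta

AUDIT_TABLES = ["COGIPF_USERLOGON", "COGIPF_RUNREPORT", "COGIPF_VIEWREPORT",
                "COGIPF_EDITQUERY", "COGIPF_RUNJOB", "COGIPF_ACTION"]  # extend as needed

def weekly_audit_maintenance(live_conn, hist_conn, hist_years=10):
    """Move aged rows from the Live audit DB to the Historical audit DB."""
    now = datetime.now()
    live_cutoff = now - timedelta(weeks=1)
    hist_cutoff = now - timedelta(days=365 * hist_years)
    action_cutoff = now - timedelta(days=182)  # roughly 6 months

    live, hist = live_conn.cursor(), hist_conn.cursor()
    for table in AUDIT_TABLES:
        # 1. Copy rows older than one week from Live to Historical.
        live.execute(f"SELECT * FROM {table} WHERE COGIPF_LOCALTIMESTAMP < ?",
                     (live_cutoff,))
        rows = live.fetchall()
        if rows:
            placeholders = ", ".join(["?"] * len(rows[0]))
            hist.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)
            hist_conn.commit()  # make the copy durable before purging Live

        # 2. Remove the copied rows so the Live DB stays small and fast.
        live.execute(f"DELETE FROM {table} WHERE COGIPF_LOCALTIMESTAMP < ?",
                     (live_cutoff,))

        # 3. Trim Historical to the retention window (N years).
        hist.execute(f"DELETE FROM {table} WHERE COGIPF_LOCALTIMESTAMP < ?",
                     (hist_cutoff,))

    # 4. COGIPF_ACTION grows fastest, so keep only about 6 months of it.
    hist.execute("DELETE FROM COGIPF_ACTION WHERE COGIPF_LOCALTIMESTAMP < ?",
                 (action_cutoff,))

    live_conn.commit()
    hist_conn.commit()
```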

Indexes
Different database platforms have different index types. A database index is a data structure, associated with a table (or view), used to improve query execution time when retrieving data from that table (or view). Work with your DBA to create the optimum strategy. They will want to know the answers to questions like these to make the best decisions on which columns to index. Obviously, the database administrator could find out the answers to some or all of these questions without your help, but it would take some research and some time:

How many records do the tables have, and to what size do you expect them to grow? (Indexing a table will not be useful unless the table has a large number of records.)
Do you know which columns are unique? Do they allow NULL values? Which columns have a data type of integer or big integer? (Columns with numeric data types that are UNIQUE and NOT NULL are strong candidates to participate in the index key.)
Where are your main performance problems today? Are they in retrieving the data? Are
there specific queries or reports which are more of a problem? (This may lead the
database administrator to some specific columns which can be optimized.)
What fields are used in joining tables for reporting?
What fields are used for filtering, sorting, grouping, and aggregating?

Not surprisingly, these are the same questions that would need to be answered for
improving the performance of any database tables.
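
If it helps to frame that conversation, a quick sketch like the following can pull current row counts for the main audit tables. It assumes a DB-API connection (e.g. pyodbc) to the Audit Database; extend the table list as needed.

```python
# Sketch: report row counts for the main audit tables so you and your DBA can
# see which tables dominate and how quickly they grow.

AUDIT_TABLES = ["COGIPF_USERLOGON", "COGIPF_RUNREPORT", "COGIPF_VIEWREPORT",
                "COGIPF_EDITQUERY", "COGIPF_RUNJOB", "COGIPF_ACTION"]

def audit_table_sizes(conn):
    cur = conn.cursor()
    for table in AUDIT_TABLES:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        print(f"{table:<30} {cur.fetchone()[0]:>12,} rows")
```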

IBM Support recommends creating an index on the columns “COGIPF_REQUESTID”, “COGIPF_SUBREQUESTID”, and “COGIPF_STEPID” for the following tables to improve performance (a sketch of the corresponding CREATE INDEX statements follows the lists below):

COGIPF_NATIVEQUERY
COGIPF_RUNJOB
COGIPF_RUNJOBSTEP
COGIPF_RUNREPORT
COGIPF_EDITQUERY

Plus on other less-used tables:

COGIPF_POWERPLAY
COGIPF_HUMANTASKSERVICE
COGIPF_HUMANTASKSERVICE_DETAIL
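
A sketch of the corresponding CREATE INDEX statements is below. The index names are made up for illustration, and the exact syntax and options (fill factor, tablespace, online rebuild) should come from your DBA for your specific platform.

```python
# Sketch: create the composite index recommended by IBM Support on each
# listed table. Index names are hypothetical; confirm CREATE INDEX syntax
# for your platform (SQL Server / Oracle / Db2) with your DBA first.

TABLES_TO_INDEX = [
    "COGIPF_NATIVEQUERY", "COGIPF_RUNJOB", "COGIPF_RUNJOBSTEP",
    "COGIPF_RUNREPORT", "COGIPF_EDITQUERY",
    "COGIPF_POWERPLAY", "COGIPF_HUMANTASKSERVICE", "COGIPF_HUMANTASKSERVICE_DETAIL",
]

def create_recommended_indexes(conn):
    cur = conn.cursor()
    for table in TABLES_TO_INDEX:
        cur.execute(
            f"CREATE INDEX IX_{table}_REQUEST "  # hypothetical index name
            f"ON {table} (COGIPF_REQUESTID, COGIPF_SUBREQUESTID, COGIPF_STEPID)"
        )
    conn.commit()
```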

You can use this as a starting point, but I would go through the exercise of answering the
questions above to arrive at the best answer for your organization.

Other Considerations
1. Audit FM Model. Remember that the Framework Manager model which IBM provides is
modeled on the default tables and fields. Any changes you make to the reporting tables will
need to be reflected in the model. The ease or complexity of these changes – or your
organizational competence to make these changes – may affect the solution you choose.
2. Additional fields. If you’re going to do it, now is the time to add additional fields for
context or reference data to improve audit reporting.
3. Summary tables. Instead of just copying the data to your historical table, compress it. You could aggregate the data to the day level to make it more efficient for reporting.
4. Views instead of tables. Others say, “So, instead of having a ‘current’ database and a
‘historical’ database, you should only have one database, and all tables in it should be
prefixed with ‘historical’. Then, you should create a set of views, one for each table that you
want to see as ‘current’, and have each view filter out the historical rows that you do not
want to see and let only the current ones pass through.”
https://softwareengineering.stackexchange.com/questions/276395/two-database-architecture-operational-and-historical/276419#276419
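
For illustration only, a sketch of that single-database variant might look like the following. It assumes the physical tables were renamed with a hypothetical HIST_ prefix and filtered on COGIPF_LOCALTIMESTAMP; you would still need to confirm that Cognos can continue writing (for example through updatable views, or by pointing audit logging at the renamed tables).

```python
# Sketch of the single-database alternative quoted above: one set of tables
# (renamed with a hypothetical HIST_ prefix) plus a filtered view per table
# that exposes only "current" rows under the original COGIPF_* name.
# The date arithmetic shown is ANSI-style; use the equivalent for your
# platform (e.g. DATEADD(day, -7, GETDATE()) on SQL Server).

AUDIT_TABLES = ["COGIPF_USERLOGON", "COGIPF_RUNREPORT", "COGIPF_VIEWREPORT",
                "COGIPF_EDITQUERY", "COGIPF_RUNJOB", "COGIPF_ACTION"]

def create_current_views(conn, days=7):
    cur = conn.cursor()
    for table in AUDIT_TABLES:
        cur.execute(f"""
            CREATE VIEW {table} AS
            SELECT * FROM HIST_{table}
            WHERE COGIPF_LOCALTIMESTAMP >= CURRENT_TIMESTAMP - INTERVAL '{days}' DAY
        """)
    conn.commit()
```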

Conclusion
The bottom line is that with the information provided here you should be well prepared to
have a productive conversation with your DBA. Chances are good that she has solved
similar problems before.

The proposed changes to the Cognos Audit Database architecture will improve performance both for direct reporting and for third-party applications that rely on it, like Motio’s ReportCard and Inventory.

By the way, if you’ve had that conversation with your DBA, we’d love to hear about it. We’d
also love to hear if you’ve solved the issue of a poorly performing Audit Database and how
you did it.
