
Tuesday 13 May 2014

OBIEE 11g: Exceeded configured maximum number of allowed input records

Oracle have limited the number of rows, columns, prompts, cells and pivot elements that can be output, using a set of default values. These values are usually set high enough never to become an issue, but for really large reports you will bump up against an error message saying you have exceeded some maximum.

So what do you do?

Step 1

Edit the instanceconfig.xml file. Typically, this file is located in the following directory: <OBIEE11g_install>\instances\instance1\config\OracleBIPresentationServicesComponent\coreapplication_obips1
If you are running a load-balanced environment, please make sure all servers have the same settings (or share the same settings), or you'll be forever chasing down 'weird' customer experiences.

Step 2

There are parts of instanceconfig.xml you cannot edit directly; those sections are marked as only being updatable via Oracle Enterprise Manager Fusion Middleware Control. The remaining elements (those without that comment above them) can be edited by hand.

<Views>
   <Pivot>
      <!--This Configuration setting is managed by Oracle Enterprise Manager Fusion Middleware Control-->
      <DefaultRowsDisplayedInDelivery>75</DefaultRowsDisplayedInDelivery>
      <!--This Configuration setting is managed by Oracle Enterprise Manager Fusion Middleware Control-->
      <DefaultRowsDisplayedInDownload>2500</DefaultRowsDisplayedInDownload>
      <!--This Configuration setting is managed by Oracle Enterprise Manager Fusion Middleware Control-->
      <DisableAutoPreview>false</DisableAutoPreview>
      <MaxCells>10000000</MaxCells>
      <MaxVisibleColumns>1000</MaxVisibleColumns>
      <MaxVisiblePages>1000</MaxVisiblePages>
      <MaxVisibleRows>10000</MaxVisibleRows>
      <MaxVisibleSections>25000</MaxVisibleSections>
      <DefaultRowsDisplayed>128000</DefaultRowsDisplayed>
   </Pivot>
   <Table>
      <!--This Configuration setting is managed by Oracle Enterprise Manager Fusion Middleware Control-->
      <DefaultRowsDisplayedInDelivery>75</DefaultRowsDisplayedInDelivery>
      <!--This Configuration setting is managed by Oracle Enterprise Manager Fusion Middleware Control-->
      <DefaultRowsDisplayedInDownload>2500</DefaultRowsDisplayedInDownload>
      <MaxVisiblePages>1000</MaxVisiblePages>
      <MaxVisibleRows>10000</MaxVisibleRows>
      <MaxVisibleSections>2000</MaxVisibleSections>
      <DefaultRowsDisplayed>64000</DefaultRowsDisplayed>
      <MaxCells>10000000</MaxCells>
   </Table>
</Views>

Note that the value you choose for <MaxCells> should never be more than <MaxVisibleColumns> * <MaxVisibleRows>.

In the case above <MaxVisibleColumns>1000</MaxVisibleColumns> * <MaxVisibleRows>10000</MaxVisibleRows> = <MaxCells>10000000</MaxCells>

Please restart all presentation services to pick up the new settings.
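
If you prefer the command line, the restart can be scripted; a minimal sketch using opmnctl, assuming the default instance path and component name (adjust both for your install):

REM Restart only the Presentation Services component so it re-reads instanceconfig.xml
cd <OBIEE11g_install>\instances\instance1\bin
opmnctl restartproc ias-component=coreapplication_obips1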

Monday 28 April 2014

OBIEE - Catalog Cleanup

Catalog Cleanup


Do you sometimes look at the catalog and wonder what kind of timebomb is lurking in there? I have nightmares. You've designed a catalog layout and asked developers to follow standards and follow the guidelines, but one year later......? What about two or even three years later? What Answers work has been started and abandoned, unfinished and unworking? What Answers remain from 'deleted' or 'hidden' dashboards and now no longer work due to repository changes?

The challenge


We are a small team of developers, so for us automation of tasks is key. For each report in the catalog, what we need to know is:
Does this report generate a usable query? (Is it broken?)
Is this report on a dashboard, or more than one dashboard?
If the report is on a dashboard, is it in the correct directory?

We anticipate that this will be repeated periodically; you can never be sure what creeps in and slips through the cracks, but at least the generation of the report can be automated and scheduled.

At a high level, we combine the output of some Catalog Manager reports, then use NQCMD to run the extracted logical queries and track the outputs.

The Catalog Reports


Firstly we will generate a Dashboard Report and then an Analysis Report, and we'll upload the data into some database tables and stitch the two together with an outer join so we can see both the reports that are on dashboards and those that are not.

We generate the Analysis report and the Dashboard report with commands like the ones below; note that your system will have different paths and settings.

D:\OBI\instances\instance1\bifoundation\OracleBIPresentationServicesComponent\coreapplication_obips1\catalogmanager\runcat.cmd -cmd report -excelFormat -distinct -offline D:\OBI\instances\instance1\bifoundation\OracleBIPresentationServicesComponent\coreapplication_obips1\catalog\samplesales -forceOutputFile D:\CatalogCapture\Analysis.txt -type "Analysis" "Name" "Path" "Table" "Column" "Subject Area"

D:\OBI\instances\instance1\bifoundation\OracleBIPresentationServicesComponent\coreapplication_obips1\catalogmanager\runcat.cmd -cmd report -excelFormat -distinct -offline D:\OBI\instances\instance1\bifoundation\OracleBIPresentationServicesComponent\coreapplication_obips1\catalog\samplesales -forceOutputFile D:\CatalogCapture\Dashboard.txt -type "Dashboard" "Name" "Path" "Dashboard Style" "Dashboard Page Name" "Dashboard Page Path" "Dashboard Page Description" "Analysis Name" "Analysis Path"

Join the two outputs on "Path" and "Analysis Path" - think outer join here because we still want to process those Analyses that are not on dashboards.

Generating a Logical Query


What we are going to do is pass each analysis into NQCMD, using the current repository, to test whether it produces a valid Logical Query.

Create a text file called creds.txt and make the contents something like
login=weblogic
pwd=weblogic123

Now, using the report output from the first part, we can run the following to extract the query from the catalog for the Answer. The query will be saved into a file called myquery.lsql
D:\OBI\instances\instance1\bifoundation\OracleBIPresentationServicesComponent\coreapplication_obips1\catalogmanager\runcat.cmd -cmd report -online http://%SERVR%/analytics/saw.dll -credentials "D:\CatalogCapture\creds.txt" -forceOutputFile "myquery.lsql" -folder "/shared/...../myreport" -type "Analysis" "SQL"

One of two things happens here:
1. The file containing the query is generated, or
2. An "empty file" is generated.

If we have an empty file, the Answer failed to generate a query and should be flagged for investigation.
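
That check is easy to script; a minimal DOS sketch, assuming the extract was written to myquery.lsql, a failed extract leaves a zero-byte file, and flagged.txt is our running list of broken Answers:

REM Flag the Answer for investigation if runcat produced an empty query file (file names are placeholders)
for %%f in ("myquery.lsql") do if %%~zf EQU 0 echo myreport: no logical query generated >> flagged.txt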

However, if we have a query in the file, we need to pass the contents through NQCMD to see if the Answer works.

Extending into NQCMD


We have previously used NQCMD to do impact analysis on repository changes and can use the functionality almost unchanged.

"D:\OBI\Oracle_BI1\bifoundation\server\bin\nqcmd.exe" -d coreapplication_OH140093583 -u weblogic -p weblogic123 -s "myquery.lsql" -o "output.lres"

Once again one of two things can happen here:
1. The logical query passed in returns a result, or
2. An error is generated.

The error may be something like

[10058][State: S1000] [NQODBC] [SQL_STATE: S1000] [nQSError: 10058] A general error has occurred.
[nQSError: 43113] Message returned from OBIS.
[nQSError: 43119] Query Failed: 
[nQSError: 17001] Oracle Error code: 942, message: ORA-00942: table or view does not exist
 at OCI call OCIStmtExecute.
[nQSError: 17010] SQL statement preparation failed.
Statement execute failed


Processed: 1 queries
Encountered 1  errors

Once again flag the error.
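
In a batch wrapper this check can be as simple as searching the NQCMD output for an nQSError; a sketch, assuming the error text is captured in output.lres as above:

REM Flag the Answer if the NQCMD output contains an nQSError (file names are placeholders)
findstr /C:"nQSError" "output.lres" >nul && echo myreport: logical query failed >> flagged.txt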

Wrapping things up and bringing it together


For each Answer we now know:
1. Is it used on a dashboard?
2. Where is the Answer stored?
3. Does it generate a logical query?
4. Does the logical query work?

Now we can focus efforts on fixing, rewriting or deleting those Analyses on dashboards, then those that are not on a dashboard but do not work.
Your initial step may simply be to identify those Answers that don't generate a logical query and feed them into a maintenance routine.

Thursday 24 April 2014

OBIEE - OBI_METRICS_AGENT on Windows

OBI_METRICS_AGENT


I would like to firstly extend my thanks to Rittman Mead for making this freely available. It managed to fill the gap in performance analysis that I had been struggling to fill. Yes, I had started down the WLST route and was pulling my hair out.

So what did I do to get it to work for me?

So what's the setup? OBIEE 11.1.1.7 on Windows Server 2008, 64-bit.

Firstly, download the zip file containing the obi_metrics_agent from https://github.com/RittmanMead/obi-metrics-agent.

The zip file contains a Python script, and this being Windows nothing is easy: we have to go and download and install Python. Please make sure you download the correct version for your machine. I downloaded Python 2.7 for 64-bit Windows (python-2.7.6.amd64.msi), ran the installer and put Python into C:\Python27.

Job done....... not quite. The obi_metrics_agent needs the lxml library. Fortunately my trusty steed "Google" was at hand and kept nudging me in the right direction.

It transpires that you need to run ez_setup.py and this will download the setup tools, allowing you to "easy install" other packages and any dependent packages. Download the ez_setup.py script and place it in the C:\Python27 directory.

Easy, I thought: from the command line I ran C:\Python27\python.exe ez_setup.py and, apart from a few error messages about the proxy being unavailable.... nothing. A few minutes of head-scratching later I realised that none of the servers can access the internet, so after a few minutes more on the good steed "Google" grazing the plains of knowledge, I downloaded the setuptools-3.4.4.zip file and placed it in the C:\Python27 directory. Now the ez_setup.py command completed without error.

Now I have easy_install.exe and can download the lxml-2.3-py2.7-win-amd64.egg file - once again, make sure you get the correct version for your Python install. You will find easy_install.exe in the C:\Python27\Scripts folder, so once again placing the egg file on the server, we run c:\Python27\Scripts\easy_install.exe lxml-2.3-py2.7-win-amd64.egg. This unpacks into the C:\Python27\Lib\site-packages directory.
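
For reference, once the downloads are sitting in C:\Python27 the whole offline install boils down to a handful of commands (a sketch of what I ran; adjust the file names to the versions you downloaded):

REM Offline install of setuptools and lxml - the zip and egg files have already been copied to C:\Python27
cd /d C:\Python27
python.exe ez_setup.py
Scripts\easy_install.exe lxml-2.3-py2.7-win-amd64.egg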

So, if everything has worked so far, you are nearly good to go.

I then unpacked the obi_metrics_agent.zip into the middleware home directory (on my machine D:\OBI), which created a subdirectory called D:\OBI\obi-metrics-agent-master, and this is where the main script resides.

After some faffing around, I settled on the following as appropriate for my needs, yours may be different.

C:\Python27\python.exe D:\OBI\obi-metrics-agent-master\obi-metrics-agent.py --opmnbin D:\OBI\instances\instance1\bin\opmnctl.bat --data-directory D:\OBI\obi-metrics-agent-master\data --interval 10

Now I must admit I have spent a fair amount of time going through the different options and outputs and it really works. I'll leave it to you to judge, but thanks Rittman Mead.

I have jobs that run every couple of minutes loading the data into a database allowing us to monitor the internals against usage - amazing!
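
If you want to schedule that collection on Windows, a scheduled task is one option. Purely a sketch - load_metrics.cmd here is a hypothetical wrapper around whatever loads the agent's output into your database:

REM Run a (hypothetical) loader script every 2 minutes to push the agent's output into the database
schtasks /create /tn "obi_metrics_load" /sc minute /mo 2 /tr "D:\OBI\obi-metrics-agent-master\load_metrics.cmd"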

Tuesday 18 March 2014

OBIEE - Automatically Generate Repository Documentation

I have strived to close the loop by automating all of the internal reporting on OBIEE.

So far we have automated the capture of what reports, filters and prompts are on our dashboards.

We have in place an automatic system for recording the performance of the reports on the current dashboards. This we run every night, and the results are uploaded to the database.

We have the option of running an 'after' script which compares the output from dashboards (logical SQL, row counts and timings) when we upload a new repository; this gives us impact analysis.

What we lacked was a way to automate the generation of the repository documentation. This was the manual step that was sometimes skipped.

Having a report on the repository, from physical to presentation and then joining this data to what's on the dashboards gives us a view of what's being used where and where the data is sourced from.

Now I must admit, this seems like a 'hack' and I wouldn't use it on any environment where I couldn't freely break things without causing damage. Oh, and it is Windows only - there is no AdminTool for Linux/Solaris - sigh.

The Repository Command Script

So what do we have to do?

We are going to use the /command switch of the AdminTool to run a script of, you guessed it, .... commands.

We've called the command script report_rep.build and put in the following few lines, replacing <MWHOME>, the repository name and the repository password with your own settings.

OpenOffline <MWHOME>\instances\instance1\bifoundation\OracleBIServerComponent\coreapplication_obis1\repository\repository.rpd <repository password>
DescribeRepository D:\scriptoutput\repository_report.csv UTF-8
Close
Exit

The next part involves creating a script to run the AdminTool and pass in the script file. Just to be safe it may be worth setting the environment using some of the bits from the bi-init.cmd as follows:

@echo off
set ORACLE_HOME=<MWHOME>\Oracle_BI1
set ORACLE_INSTANCE=<MWHOME>\instances\instance1
set ORACLE_BI_APPLICATION=coreapplication
set JAVA_HOME=<MWHOME>\Oracle_BI1\jdk
set ESSBASEPATH=%ORACLE_HOME%\clients\epm\Essbase\EssbaseRTC
call %ORACLE_INSTANCE%\bifoundation\OracleBIApplication\%ORACLE_BI_APPLICATION%\setup\user.cmd
set PATH=%ORACLE_HOME%\common\ODBC\Merant\5.3\Drivers;%ORACLE_HOME%\bifoundation\server\bin;%ORACLE_HOME%\bifoundation\web\bin;%ESSBASEPATH%\bin;%ORACLE_HOME%\bin;%PATH%
<MWHOME>\Oracle_BI1\bifoundation\server\bin\AdminTool.exe /command d:\scripts\report_rep.build

Before running the above script we copy the latest repository to one side and call it repository.rpd. Not very imaginative, I know, but that way we don't muck around with the actual repository and, in theory, if we script it right, this will always be a copy of the latest and most up-to-date RPD.
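
The copy itself is a one-liner at the top of the wrapper script; a sketch, where live.rpd stands in for whatever your current repository file is called:

REM Take a copy of the live RPD so the report always runs against the latest version (file names are examples)
copy /Y "<MWHOME>\instances\instance1\bifoundation\OracleBIServerComponent\coreapplication_obis1\repository\live.rpd" "<MWHOME>\instances\instance1\bifoundation\OracleBIServerComponent\coreapplication_obis1\repository\repository.rpd"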

You should be left with a CSV report of your repository (all being well).

Upload to your favourite database and analysis tool.



Tuesday 4 March 2014

OBIEE - Displaying the Value of System Variables

Following on from the previous article here on the subject of displaying your repository variables, I thought it may be time to look at showing some of the system variables.

On a dashboard, drag in a text item and edit the contents to include something like

We are running OBIEE version [u][b]-@{system.productVersion} [/b][/u]
This is the @{dashboard.name} dashboard
and we are on the @{dashboard.currentPage} page.

Monday 3 March 2014

OBIEE - Report Footer

Report Footer

You can create a report footer by adding another Title view and editing the layout.
- set the display text to custom and enter "Confidential" or some sort of disclaimer text.
- set the style to show the text as centered, bold and Italic.
- there you go, a confidential footer for your report.

Friday 28 February 2014

OBIEE - Aggregation Rules and Functions

The following list describes the aggregation rules that are available for measure columns in the "Layout pane" and for columns in the "Edit Column Formula dialog: Column Formula tab". The list also includes functions that you can use when creating a calculated item.

  • Default — Applies the default aggregation rule as in the Oracle BI repository or by the original author of the analysis. Not available for calculated items.
  • Server Determined — Applies the aggregation rule that is determined by the Oracle BI Server (such as the rule that is defined in the Oracle BI repository). The aggregation is performed within Presentation Services for simple rules such as Sum, Min, and Max. Not available for measure columns in the Layout pane or for calculated items.
  • Sum — Calculates the sum obtained by adding up all values in the result set. Use this for items that have numeric values.
  • Min — Calculates the minimum value (lowest numeric value) of the rows in the result set. Use this for items that have numeric values.
  • Max — Calculates the maximum value (highest numeric value) of the rows in the result set. Use this for items that have numeric values.
  • Average — Calculates the average (mean) value of an item in the result set. Use this for items that have numeric values. Averages on tables and pivot tables are rounded to the nearest whole number.
  • First — In the result set, selects the first occurrence of the item for measures. For calculated items, selects the first member according to the display in the Selected list. Not available in the Edit Column Formula dialog.
  • Last — In the result set, selects the last occurrence of the item. For calculated items, selects the last member according to the display in the Selected list. Not available in the Edit Column Formula dialog.
  • Count — Calculates the number of rows in the result set that have a nonnull value for the item. The item is typically a column name, in which case the number of rows with nonnull values for that column are returned.
  • Count Distinct — Adds distinct processing to the Count function, which means that each distinct occurrence of the item is counted only once.
  • None — Applies no aggregation. Not available for calculated items.
  • Server Complex Aggregate — Applies the aggregation rule that is determined by the Oracle BI Server (such as the rule that is defined in the Oracle BI repository). The aggregation is performed by the Oracle BI Server, rather than within Presentation Services. Not available for calculated items.
  • Report-Based Total (when applicable) — If not selected, specifies that the Oracle BI Server should calculate the total based on the entire result set, before applying any filters to the measures. Not available in the Edit Column Formula dialog or for calculated items. Only available for attribute columns.

OBIEE - Connection Pools



Apart from creating your connection pool and filling in the type of database, username and password, do you give much thought to the connection pool?

If you have heavy usage, you should increase the maximum number of connections from the default of 10.
You will definitely need a connection pool separate from the others to set your session variables.
You will also need a separate connection pool for usage tracking.
Keep a separate connection pool for write back.

Just a few thoughts

Thursday 27 February 2014

OBIEE - Custom Login Message

OBIEE - Customised Login Message


The first thing your users see if they need to log in is the login screen, and with a little effort you can customise this to "up" the wow factor and provide information.

We will customise the following file.
OBIEE_HOME\Oracle_BI1\bifoundation\web\msgdb\pages\common\signin.html

and as always !!!BACKUP!!! before making changes.

On our non-production environments we have a message like the one below.
...
<div align="center" class="centerDiv">
<table width="180" border="0" cellpadding="0" cellspacing="0">
   <tr>
      <td class="lefttop"></td>
      <td class="topmiddle" width="100%" height="12"><span style="background-color:#F6CED8;text-align:center;font-weight:bold;color:#000000">
        You are connecting to a Non-Production environment!
   </span></td>
      <td class="righttop"></td>
   </tr>

   <tr>...

Customised Login Images

You can go further by adding images

Locate the <form id="logonForm" tag, then the table tag below it:
            <table width="100%" cellpadding="0" cellspacing="0">

Underneath add the following lines (assuming that Your_Logo.png is the image):

<tr>
<td>
<img src="fmap:login/Your_Logo.png" style="visibility: visible;" class="defaultHidden" width="280" align="bottom" height="120" alt="Description of Your Logo"/>
</td>
</tr>

Remember, if you are uploading an image, to put the "Your_Logo.png" file in the following three locations:

OBIEE_HOME\user_projects\domains\bifoundation_domain\servers\bi_server1\tmp\_WL_user\analytics_11.1.1\7dezjl\war\res\sk_blafp\b_mozilla_4

OBIEE_HOME\user_projects\domains\bifoundation_domain\servers\bi_server1\tmp\_WL_user\analytics_11.1.1\7dezjl\war\res\sk_blafp\login

OBIEE_HOME\Oracle_BI1\bifoundation\web\app\res\sk_blafp\login

Wednesday 26 February 2014

OBIEE - Sample Administration Documentation

As I mentioned in Part 1 of a previous article here, you really need to have documentation around your OBIEE environment.


Not the most glamorous of suggestions surely, but your boss and sysadmin will love you. Start by documenting where to find stuff: the middleware home, logfiles, config files (instanceconfig.xml etc.), and the URLs used for administration.

Then move on to documenting the config changes made after the install: what settings were changed and, if known, what the default values were before and what they are now. What are the connection details for your RCU schemas (BIPLATFORM and MDS)? Capture things like server names and directory structures.

Now come the living documents, which you can generate periodically and version. Run off a repository report and catalog reports. This will give you (and your colleagues) a better understanding of what the repository looks like and also what in the catalog is where on the dashboards.

As an added benefit you can now work backwards from the catalog reports through the repository reports to see where data comes from to fill reports, and vice-versa, from a physical column work your way through the repository to the catalog report to see where the data is used.

Remember, having no documentation means a risk to your business.

And keep it up to date; mark your calendar for a monthly review. Most times there will be no change, but there will be that month.......

Here's a suggestion of some topics to get you started!

General Info about your OBIEE Install

Server
URL
DNS Shortcut
WebLogic Console
Enterprise Manager
Repository Name
Catalog Path

Install Type Information

Type
OS
Install Dir

OBIEE Configuration

writeback
EVALUATE_SUPPORT_LEVEL
Usage Tracking
Repository Monitoring

Users/Passwords

Schema Prefix
BIPLATFORM
MDS
weblogic
Repository


Custom Deployments

password_change.ear

Maps

Tuesday 25 February 2014

OBIEE Lockdown - Part 1

Security Recommendations in Oracle Business Intelligence Installations


There are three key security recommendations for installing and securing OBIEE in your enterprise environment.

1.       Be serious about password management.


It is important to lock down the OBIEE environment and change the default passwords after installation.

The password for the weblogic user should not be simple to guess or widely distributed. The best practice is to have different passwords across the environments. Moreover, I'd recommend replacing the "weblogic" user with another, obfuscated username. The username and password combination should be different in each environment (at least the password).

Change the default passwords for the RPD, as everyone knows "Admin123." If you are promoting an RPD, your passwords for the Dev, Test, and Prod RPDs should be different as well, to avoid the possibility of a developer making changes in the wrong environment.

Do not use the administrative account to present reports to users, and avoid providing it to them; there is a high risk of breaking or deleting something by accident. Consider automating deployments to avoid having to deal with passwords manually.

2.       Take the authentication and authorization outside of OBIEE


Use whatever is standard in your organization for managing authentication (be it OID, Active Directory, or some other sort of LDAP). Not only does it improve user experience by removing the need to memorize yet another username/password, but it also saves time for the development team. Remember: they are not in the user management business. Another great use case is the ability to push this functionality further by automating users' authorization, managing LDAP group memberships and mapping them to OBIEE's application roles.

3.       RPD Security Objects


Unfortunately, the Administration Tool does not allow development separation within the repository objects (one needs to be an administrator to edit the RPD file – no way around it), hence anyone with the RPD/Admin password can do pretty much anything within the RPD. At this time, there are limited built-in options to audit or keep track of RPD development changes. One must rely on a custom-developed solution if this functionality is important. An easy and practical option is to keep an XLS spreadsheet of the major development milestones. It's simple; however, its disadvantage is that there is an overhead associated with maintaining it, especially during active development periods. Another option would be to use an Administration Tool utility to create a metadata repository document on a regular basis and track deltas. Finally, more advanced options would include MUDE (multi user development environment) projects and XML export.

Friday 21 February 2014

OBIEE - Increase Map Joins

OBIEE has a default limit of 500 joins to a map.

There are occasions where you may need to raise this limit: you will need to change this setting if more points of contact to the map exist.

You will need to add something like the below into your instanceconfig.xml file

<SpatialMaps>
   <ColocatedOracleMapViewerContextPath>/mapviewer</ColocatedOracleMapViewerContextPath>
   <RemoteOracleMapViewerAbsoluteURL/>
   <LayerDataLayout>
      <MaxRecords>1500</MaxRecords>
   </LayerDataLayout>
</SpatialMaps>

Thursday 20 February 2014

OBIEE Repository Variable Report

You can create a report to show what your repository variables are set to by doing something like below:

Create a 'dummy' report with a static text view, enter something like the below, and make sure you have selected the 'Contains HTML Markup' box.

[u][b]Predefined Presentation Variables[/b][/u][br/]
[b]session.language:[/b] @{session.language}[br/]
[b]session.locale:[/b] @{session.locale}[br/]
[b]DISPLAYNAME:[/b]@{biServer.variables['NQ_SESSION.DISPLAYNAME']}[br/]
[b]USER:[/b]@{biServer.variables['NQ_SESSION.USER']}[br/]
[b]GROUP:[/b]@{biServer.variables['NQ_SESSION.GROUP']}[br/]
[b]ROLES:[/b]@{biServer.variables['NQ_SESSION.ROLES']}[br/]
[b]ROLEGUIDS:[/b]@{biServer.variables['NQ_SESSION.ROLEGUIDS']}[br/]
[b]PERMISSIONS:[/b]@{biServer.variables['NQ_SESSION.PERMISSIONS']}[br/]
[b]USERGUID:[/b]@{biServer.variables['NQ_SESSION.USERGUID']}[br/]

Update: See more on System Variables here.

Wednesday 5 February 2014

OBIEE - Repository System Session Variables

Session variables are similar to dynamic repository variables in that they obtain their values from initialization blocks. Unlike dynamic repository variables, however, the initialization of session variables is not scheduled. When a user begins a session, the Oracle BI Server creates new instances of session variables and initializes them.

There are two types of session variables:
  • System Session Variables
  • Nonsystem Session Variables

System Session Variables:

System session variables are session variables that the Oracle BI Server and Oracle BI Presentation Services use for specific purposes. System session variables have reserved names that cannot be used for other kinds of variables (such as static or dynamic repository variables and nonsystem session variables).

When you use these variables for Oracle BI Presentation Services, preface their names with NQ_SESSION. For example, to filter a column on the value of the variable LOGLEVEL, set the filter to the variable NQ_SESSION.LOGLEVEL.

 
  • USER: Holds the value the user enters as his or her logon name. This variable is typically populated from the LDAP profile of the user.
  • PROXY: Holds the name of the proxy user. A proxy user is a user that has been authorized to act for another user.
  • GROUP: Contains the groups to which the user belongs. Exists only for compatibility with previous releases. Legacy groups are mapped to application roles automatically.
  • WEBGROUPS: Specifies the Catalog groups (Presentation Services groups) to which the user belongs, if any. Note that the recommended practice is to use application roles rather than Catalog groups.
  • USERGUID: Contains the global unique identifier (GUID) of the user, typically populated from the LDAP profile of the user.
  • ROLES: Contains the application roles to which the user belongs.
  • ROLEGUIDS: Contains the global unique identifiers (GUIDs) for the application roles to which the user belongs. GUIDs for application roles are the same as the application role names.
  • PERMISSIONS: Contains the permissions held by the user, such as oracle.bi.server.impersonateUser or oracle.bi.server.manageRepository.
  • DISPLAYNAME: Used for Oracle BI Presentation Services. It contains the name that is displayed to the user in the greeting in the Oracle BI Presentation Services user interface. It is also saved as the author field for catalog objects. This variable is typically populated from the LDAP profile of the user.
  • PORTALPATH: Used for Oracle BI Presentation Services. It identifies the default dashboard the user sees when logging in (the user can override this preference after logging on).
  • LOGLEVEL: The value of LOGLEVEL (a number between 0 and 5) determines the logging level that the Oracle BI Server uses for user queries. This system session variable overrides a variable defined in the Users object in the Administration Tool. If the administrator user (defined upon install) has a Logging level defined as 4 and the session variable LOGLEVEL defined in the repository has a value of 0 (zero), the value of 0 applies.
  • REQUESTKEY: Used for Oracle BI Presentation Services. Any users with the same nonblank request key share the same Oracle BI Presentation Services cache entries. This tells Oracle BI Presentation Services that these users have identical content filters and security in the Oracle BI Server. Sharing Oracle BI Presentation Services cache entries is a way to minimize unnecessary communication with the Oracle BI Server.
  • SKIN: Determines certain elements of the look and feel of the Oracle BI Presentation Services user interface. The user can alter some elements of the user interface by picking a style when logged on to Oracle BI Presentation Services. The SKIN variable points to an Oracle BI Presentation Services folder that contains the nonalterable elements (for example, figures such as GIF files). Such directories begin with sk_. For example, if a folder were called sk_companyx, the SKIN variable would be set to companyx.
  • DESCRIPTION: Contains a description of the user, typically populated from the LDAP profile of the user.
  • USERLOCALE: Contains the locale of the user, typically populated from the LDAP profile of the user.
  • DISABLE_CACHE_HIT: Used to enable or disable Oracle BI Server result cache hits. This variable has a possible value of 0 or 1.
  • DISABLE_CACHE_SEED: Used to enable or disable Oracle BI Server result cache seeding. This variable has a possible value of 0 or 1.
  • DISABLE_SUBREQUEST_CACHE: Used to enable or disable Oracle BI Server subrequest cache hits and seeding. This variable has a possible value of 0 or 1.
  • SELECT_PHYSICAL: Identifies the query as a SELECT_PHYSICAL query.
  • DISABLE_PLAN_CACHE_HIT: Used to enable or disable Oracle BI Server plan cache hits. This variable has a possible value of 0 or 1.
  • DISABLE_PLAN_CACHE_SEED: Used to enable or disable Oracle BI Server plan cache seeding. This variable has a possible value of 0 or 1.
  • TIMEZONE: Contains the time zone of the user, typically populated from the LDAP profile of the user.


The SELECT_PHYSICAL command provides the functionality to directly query objects in the Physical layer of the metadata repository, and to nest such a statement within a query against the Business Model and Mapping layer or the Presentation layer.

Syntax for the SELECT_PHYSICAL Statement

Basic syntax for SELECT_PHYSICAL queries is equivalent to the basic syntax of a SELECT statement, with the term SELECT_PHYSICAL replacing the word SELECT, namely:

SELECT_PHYSICAL [DISTINCT] select_list
FROM from_clause
[WHERE search_condition]
[GROUP BY column {, column}
     [HAVING search_condition]]
[ORDER BY column {, column}] 
 
 Source:
http://download.oracle.com/docs/cd/E14571_01/bi.1111/e10540/variables.htm#BIEMG3104

Friday 24 January 2014

Smooth(er) OBIEE Administration

I am one of those people who likes to know how things work. I'm constantly digging through the Oracle guides for OBIEE and trawling the net to find those nuggets of information that will make my life easier.

I have a confession to make at this point: I will go out of my way to avoid doing repetitive (boring) work. Like most people I would rather do something new and exciting than push the same buttons in the same sequence day after day.

OBIEE administration is one of those jobs that could suck up hours and hours of your day unless you think smart. Unfortunately, most environments expect you to 'get stuck in' and then before you know it ......... the same thing day-after-day. Every OBIEE site has common issues that they struggle with: migration across environments, performance, testing, development, lineage.....

Part 1. OBIEE Documentation: 


Not the most glamorous of suggestions surely, but your boss and sysadmin will love you. Start by documenting where to find stuff: the middleware home, logfiles, config files (instanceconfig.xml etc.), and the URLs used for administration.

Then move on to documenting the config changes made after the install: what settings were changed and, if known, what the default values were before and what they are now. What are the connection details for your RCU schemas (BIPLATFORM and MDS)? Capture things like server names and directory structures.

Now come the living documents, which you can generate periodically and version. Run off a repository report and catalog reports. This will give you (and your colleagues) a better understanding of what the repository looks like and also what in the catalog is where on the dashboards.

As an added benefit you can now work backwards from the catalog reports through the repository reports to see where data comes from to fill reports, and vice-versa, from a physical column work your way through the repository to the catalog report to see where the data is used.

Remember, having no documentation means a risk to your business. View sample documentation here.

Part 2.  OBIEE Automation:


You don't already? You will need to think of all those tasks you do and tackle them one-by-one. It may take you some time to get to a state where executing a single script will do an entire day's work, but you can certainly make a start by taking simple startup and shutdown tasks and automating them. Join them together, shutdown then startup, and there you go: you have a script to 'bounce' the environment. How much time did that save?
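
As a sketch of that first script (the instance path is an assumption, and the WebLogic Managed and Admin Servers need their own start/stop steps):

@echo off
REM Bounce the OBIEE system components (BI Server, Presentation Services, etc.) via OPMN
call <MWHOME>\instances\instance1\bin\opmnctl.bat stopall
call <MWHOME>\instances\instance1\bin\opmnctl.bat startall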

You will need to have scripts to help with testing connection pools and migrating RPDs and catalogs. You can generate scripts that 'auto-correct': they start up an environment if it is down, or restart components via OPMN. How about migrating users and groups across environments?

When you create a script you put all the effort in up front and reap the benefits in the days that follow.

Part 3. OBIEE Procedures:


There is nothing wrong with doing all your tasks in an ad-hoc manner for a Proof of Concept Project. The nature of the POC is to rapidly change things to reflect the change in your customer's requirements.

But what ad-hoc work does is distract you and eat up your time. Before you know it, you've spent an entire day ad-hoccing (is that a word?) and have nothing to show for it.

If you have a thread of automation or a deployment script for promotion across environments then with a little planning and communication you can ease the workload.

So what do you need to know to deploy? What are you deploying, from where to where and what are the configuration details and security settings?

This doesn't have to become something heavy and cumbersome with reams of documentation; a couple of lines will do, and it provides project history and documentation as a byproduct.

Each project will have its own lifecycle and demands, but set up a process and schedule for releases: do you absolutely have to release the latest developer build to test immediately, or can it wait until an agreed time when several pieces of work can be batched and planned in? Immediately the level of distraction for you has dropped.

In conclusion, as you start to fill in some of the parts of administration, you start a feedback cycle that will over time reduce your workload, increase your understanding of OBIEE and make you more productive.

Tuesday 21 January 2014

Generating a OBIEE Performance Test Harness

You know you should be testing your OBIEE stack for performance, but what exactly should you be doing, and how can you do it?

Another question may be what is your baseline? Do you have a known baseline for performance? What tools do you use?

What we want is a method of generating a test plan where we can run (repeatably) the reports on our dashboards. Ideally we would like to generate a series of tests that we can run before a release to generate a baseline and after a release to ensure that nothing (that we are testing) is broken.

So what is available to us "out of the box" that we can plug in and start using straight away?

We will use a couple of components to build up a test.

  • Usage Tracking
  • NQCMD


Usage tracking: 

Turn on usage tracking for one of your users and open a few dashboards and reports. If you have things configured correctly you should see some entries in the S_NQ_ACCT table (I have made the assumption that you will enable database recording of usage tracking - see how to set this up here.)

The S_NQ_ACCT table tells us what components were run when you viewed the dashboard. To find out which components were recorded, query the "QUERY_SRC_CD" column. A sample of results may show something like:
DashboardPrompt
Report
ValuePrompt
Member Browser Display Values
Soap
rawSQL

The logical query run for each component is in the "QUERY_TEXT" column, though this truncates the full query after a number of characters and, if, like me, you have some very large logical queries, it may be worthwhile getting the query out of the "QUERY_BLOB" column.

NQCMD:

I have already discussed NQCMD here but it may be worthwhile going over some of the concepts briefly. NQCMD is a mechanism for executing a logical query against a repository for a user. This means we can execute a logical query against a repository of our choice and run the query as a named user.

The command is in the format
MW_HOME\Oracle_BI1\bifoundation\server\bin\nqcmd.exe -d coreapplication_OH140000583 -u <username> -p <password> -s <Query input file> -T -q  1> output.out

Running the help gives the following output
MW_HOME\Oracle_BI1\bifoundation\server\bin\nqcmd.exe -help

-------------------------------------------------------------------------------
          Oracle BI ODBC Client
          Copyright (c) 1997-2011 Oracle Corporation, All rights reserved
-------------------------------------------------------------------------------


Argument error near: -help
Command: MW_HOME\Oracle_BI1\bifoundation\server\bin\nqcmd.exe - a command line client which can issue SQL statements
                 against either Oracle BI server or a variety of ODBC compliant backend databases.
SYNOPSIS
         nqcmd [OPTION]...
DESCRIPTION
         -d<data source name>
         -u<user name>
         -p<password>
         -s<sql input file name>
         -o<output result file name>
         -D<Delimiter>
         -C<# number of fetched rows by column-wise binding>
         -R<# number of fetched rows by row-wise binding>
         -a (a flag to enable async processing)
         -f (a flag to enable to flush output file for each write)
         -H (a flag to enable to open/close a request handle for each query)
         -z (a flag to enable UTF8 instead of ACP)
         -utf16 (a flag to enable UTF16 instead of ACP)
         -q (a flag to turn off row output)
         -NoFetch (a flag to disable data fetch with query execution)
         -NotForwardCursor (a flag to disable forwardonly cursor)
         -v (a flag to display the version)
         -SessionVar <SessionVarName>=<SessionVarVAlue>

To use NQCMD we need to pass a logical query SQL in a file.
The logical query file can look something like

SET VARIABLE PRODUCT_NAME='T00103', DISABLE_PLAN_CACHE_HIT=1, DISABLE_CACHE_HIT=1, LOGLEVEL=2, QUERY_NAME='Q0213EB1783381CBEC92AB1166D5FF8FC', TEST=1:SELECT .................. FETCH FIRST 25001 ROWS ONLY

Now that we have looked at the two parts we'll use, let's put it all together.

To start we need to mine the usage tracking table for the logical queries of the reports on our dashboards that we wish to test.
Each query should be in its own file.
Ideally each query should be uniquely identifiable, so we can trace a query back to a relevant report or dashboard and also as a means of identification when running the test.

Generating the test logical query input files: I have knocked up a SQL script that extracts the queries from the Usage Tracking table and saves them in a series of files. Because SQL*Plus cannot spool to multiple files dynamically, the following script generates an intermediate file, which is then called and executed.

SET TERMOUT OFF
SET SERVEROUTPUT ON
SET ECHO OFF
SET DEFINE OFF
SET FEEDBACK OFF
SET LINESIZE 5000
SPOOL intermediate_file.sql

DECLARE
  CURSOR cr_qrys  IS
      SELECT DISTINCT TO_CHAR (query_blob) AS query_text, query_key
        FROM s_nq_acct
       WHERE     query_src_cd = 'Report'
             AND dbms_lob.getlength (query_blob) <= 3900
             AND ERROR_TEXT IS NULL
             AND row_count > 0
    ORDER BY query_key;

  i          NUMBER := 0;
  filename   VARCHAR2 (5);
  p1         VARCHAR2 (5000);
BEGIN
  dbms_output.enable (NULL);

  FOR lr_qrys IN cr_qrys LOOP
    i          := i + 1;
    filename   := LPAD (TO_CHAR (i), 5, '0');
    p1         :=
         'SET VARIABLE PRODUCT_NAME=''T'
      || filename
      || ''', DISABLE_PLAN_CACHE_HIT=1, DISABLE_CACHE_HIT=1, LOGLEVEL=2, QUERY_NAME=''Q'
      || lr_qrys.query_key
      || ''', TEST=1:'
      || lr_qrys.query_text;
    dbms_output.put_line ('SET COLSEP ,  
SET PAGESIZE 0
SET TRIMSPOOL ON 
SET HEADSEP OFF
SET FEEDBACK OFF
SET DEFINE OFF
SET ECHO OFF
SET TERMOUT OFF
SET LINESIZE 5000
SPOOL test_file_' || filename || '.inp
SELECT q''[' || p1 || ']''
FROM DUAL;
SPOOL OFF');
  END LOOP;
END;
/

SPOOL OFF
PROMPT     executing intermediate file
@intermediate_file.sql;
SET SERVEROUTPUT OFF

This generates a series of files named test_file_xxxxx.inp, where the xxxxx matches the test PRODUCT_NAME variable of the query.

This is the first version of our baseline series of tests. As new dashboards and reports are released you will need to update the test input files with the new report queries and potentially remove those reports that are no longer relevant.

The next step is to pass each file through NQCMD in turn and capture the output. Assuming the logical query test input files are in a subdirectory called testcases (and using DOS) we can loop through the .inp files as follows:

for %%i in (testcases\*.inp) do (
<mw_home>\Oracle_BI1\bifoundation\server\bin\nqcmd.exe -d <dsn to repository> -u <username> -p <password> -s %%i -T -q >%%~ni.out
)

Once the above loop has finished we will have a series of files called test_file_xxxxx.out

Peering into an output file you will see they have a structure as detailed below


-------------------------------------------------------------------------------
          Oracle BI ODBC Client
          Copyright (c) 1997-2011 Oracle Corporation, All rights reserved
-------------------------------------------------------------------------------


SET VARIABLE PRODUCT_NAME='T03818', DISABLE_PLAN_CACHE_HIT=1, DISABLE_CACHE_HIT=1, LOGLEVEL=2, QUERY_NAME='Q4DFD2F426D93BC2DA37036577533BF8B', TEST=1:SELECT ........... ASC NULLS LAST, 3 ASC NULLS LAST FETCH FIRST 200001 ROWS ONLY

SET VARIABLE PRODUCT_NAME='T03818', DISABLE_PLAN_CACHE_HIT=1, DISABLE_CACHE_HIT=1, LOGLEVEL=2, QUERY_NAME='Q4DFD2F426D93BC2DA37036577533BF8B', TEST=1:SELECT .............................

--------------------------------------------------------------------------------------------------------------------------------------------------------
s_0          s_1                                                                                                  s_2          3 s_4                    
--------------------------------------------------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------------------------------------------------
Clock time: batch start: 10:34:21.000 Query from: 10:34:22.000 to: 10:35:22.000 Row count: 15
 total: 59.920 prepare: 0.016 execute: 59.889 fetch: 0.015
Cumulative time(seconds): Batch elapsed: 60 Query total: 59.920 prepare: 0.016, execute: 59.889, fetch: 0.015, query count:  1, cumulative rows:  15

--------------------------------------------------------------------------------------------------------------------------------------------------------


Processed: 1 queries

You can see from the contents that we have a wealth of information, and using a couple of scripts you can extract the values for PRODUCT_NAME and the timings near the end of each file (a starting point is sketched after the table below). When you process them together you should have something like:

Date, Test No, Batch Start, Query From, Query To, Row Count, Total Time, Prepare Time, Execute Time, Fetch Time, Batch Elapsed, Prepare, Query Total, Fetch, Query Count, Cumulative Rows
21/01/2014, 'T00005', 15:24:08, 15:24:11, 15:24:17, 2, 5.906, 0.016, 5.859, 0.031, 8, 0.016, 5.859, 0.031, 1, 2
21/01/2014, 'T00007', 15:24:21, 15:24:22, 15:24:23, 9, 0.593, 0, 0.578, 0.015, 1, 0, 0.578, 0.015, 1, 9
21/01/2014, 'T00008', 15:24:24, 15:24:25, 15:24:38, 35, 12.671, 0.031, 12.624, 0.016, 13, 0.031, 12.624, 0.016, 1, 35

Which you can load into your favourite database and perhaps even build a dashboard on.
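
As a starting point for those extraction scripts, something like the below pulls the query lines (which carry the PRODUCT_NAME) and the timing lines out of every output file into one place for further parsing; the file names are just the ones used above:

REM Collect the PRODUCT_NAME and timing lines from every NQCMD output file
for %%i in (test_file_*.out) do (
findstr /C:"PRODUCT_NAME" /C:"Clock time" /C:"Cumulative time" "%%i" >> all_timings.txt
)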





Monday 20 January 2014

OBIEE BUNDLE PATCH 11.1.1.7.140114

Patch 17886497: BI BUNDLE PATCH 11.1.1.7.140114


The latest patchset for OBIEE 11.1.1.7 has just come out.

The Oracle BI EE Suite Bundle Patch 11.1.1.7.140114 under the top-level patch 17886497 consists of the following component patches:
Patch - Abstract
16913445 - Oracle Business Intelligence Installer (BIINST)
17922352 - Oracle Business Intelligence Publisher (BIP)
17300417 - Enterprise Performance Management Components Installed from BI Installer 11.1.1.7.0 (BIFNDNEPM)
17922552 - Oracle Business Intelligence Server (BISERVER)
17922596 - Oracle Business Intelligence Presentation Services (BIPS)
17300045 - Oracle Real-Time Decisions (RTD)
16997936 - Oracle Business Intelligence ADF Components (BIADFCOMPS)
17922577 - Oracle Business Intelligence Platform Client Installers and MapViewer

This weighs in at a hefty 2.5Gb and includes 8 patches.

Don't forget to clear cache and read the readme documentation.

Tuesday 14 January 2014

OBIEE BI Mobile App Designer Installation


Christmas is out of the way, New Year's resolutions made (and some broken - sigh).

I promised myself last year to take a look at the OBIEE Mobile App Designer module as soon as time permitted, and to cut a long story short, I made time.

What are the prerequisites for installation?


You must be running OBIEE 11.1.1.7.1; if you are not, then you will need to patch up to this version:
1. If you are on version 11.1.1.6.* or lower, patch to 11.1.1.7.0 first.
2. Apply patch 16556157, which is a collection of patches you apply one after the other, and then the required patch 16569379.
3. You are now in a position to apply patch 17220944, which is the BI Mobile App Designer.

The first patch onto my 11.1.1.7.0 system was a hefty 2.5Gb download on 64-bit Windows and contained 8 patches; including the required patch, it managed to eat up a good couple of hours of my day to install.

Make sure you follow the readme documents; there's nothing surprising there, but pay attention and you should be fine.

The final step is to install the BI Mobile App (patch 17220944).


1. Stop the OBIEE components OPMN, Managed Server and then Admin Server.
2. Apply the patch using opatch apply (see your documentation for environment variable settings).
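
The apply step looks something like the below; a sketch only - the patch staging directory is an example and the exact environment settings are in the patch readme:

REM Environment and opatch invocation - adjust the paths to your own install and follow the patch readme
set ORACLE_HOME=<mw_home>\Oracle_BI1
set PATH=%ORACLE_HOME%\OPatch;%PATH%
cd /d D:\patches\17220944
opatch apply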

The following steps are to deploy and configure the Application and the surrounding security.


You will need to run the WebLogic Configuration Assistant to deploy the OBI Mobile App Designer.

Start--> Oracle Weblogic--> Weblogic Server 11gR1--> Tools--> Configuration Wizard.
1. Select extend an existing domain.
2. Select a WebLogic Domain Directory (bifoundation_domain).
3. Select Extension Source (select Extend My domain using a template and select <mw_home>/Oracle_BI1/common/templates/applications/oracle.bimad_template_11.1.1.jar)
4. The next few screens do not apply and you can click through using Next to the Configuration Summary screen where you click Extend.
5. You are finished with the Configuration Assistant and can close the wizard.

Now you will upgrade the security configuration.

1. Start only the WebLogic Administration Server, not the Managed Server or OPMN. (<mw_home>\user_projects\domains\bifoundation_domain\bin\startweblogic.cmd)
2. Run the Python WLST script to integrate with Oracle Fusion Middleware:
cd <mw_home>/Oracle_BI1/install
now run
<mw_home>/Oracle_BI1/common/bin/wlst.cmd addMADCodeGrants.py t3://<yourservername>:7001 weblogic

Enable the Mobile App Designer from Presentation Services.

1. Edit the instanceconfig.xml file in (<mw_home>\instances\instance1\config\OracleBIPresentationServicesComponent\coreapplication_obips1)
2. Add the <EnableMAD> element above the closing </ServerInstance> tag as follows
  <ServerInstance>
    .....
     <AdvancedReporting>
         <EnableMAD>true</EnableMAD>
    </AdvancedReporting>
  </ServerInstance>

Start the Managed Server and OPMN.

Set up the local App Library 


1. Create a folder in the Catalog to be the local Apps Library. Under Shared Folders create a folder named "Apps Library".
2. Open the xmlp-server-config.xml file. It is located under <DOMAIN_HOME>/config/bipublisher/repository/Admin/Configuration.
3. Add the property "APPS_LIBRARY_FOLDER_LOCAL" and set the value to be the path to the folder you created in Step 1. For example: <property name="APPS_LIBRARY_FOLDER_LOCAL" value="/Apps Library"/>
4. Restart the bimad (BI Mobile App Designer) application.

You should now see the Mobile Application on the "Home" page and from the "New" dropdown menu. Recent changes to apps appear in the Others section under Recent Dashboards.

Let me know if you have any problems.