Design a multidimensional cube using Schema Workbench in Pentaho CE BI Suite 3.0

This exercise will teach you how to design a new cube, publish it to the Pentaho server, and view it via the Pentaho User Console.

This exercise uses Pentaho Schema Workbench 3.0.4 (stable) for designing the cubes, Pentaho CE BI Suite 3.0 for hosting and viewing the cube we design, and ‘SampleData’, a Hypersonic SQL (HSQLDB) database, as the data source. The ‘SampleData’ database ships with the free download of Pentaho CE BI Suite 3.0.

Follow the steps below to design and view a new cube.

Step 1:

Make sure the Pentaho server is up and running.

Step 2:

Once the Pentaho server has started, go to the folder where the ‘Schema Workbench’ tool is installed on your system.

Step 3:

In the ‘schema-workbench’ folder, double-click the batch file ‘workbench’ to start the tool (or right-click it and choose to run it). This will open the Schema Workbench window along with a command prompt. Maximize the window.

Step 4:

Click on the ‘Tools’ menu and select ‘Preferences’. This will open the ‘Workbench Preferences’ window, where we provide the JDBC details for the data source we are using.

Step 5:

In the ‘Workbench Preferences’ window please provide the following details.

Note: As we are using the ‘SampleData’ HSQL database, the details given below are specific to that database. If you use Oracle, MySQL, etc., they will change accordingly.

Driver Class Name: org.hsqldb.jdbcDriver

Connection URL: jdbc:hsqldb:hsql://localhost/sampledata

User Name: pentaho_user

Password: password

Schema (Optional): <leave it blank>

Require Schema Attributes: Check this option.

Click on ‘Accept’ button.

Step 6:

To create a new schema file, select the ‘File → New → Schema’ menu item.

This will open the ‘New Schema 1’ window with the schema file name as ‘Schema1.xml’. Please refer to the below screenshot.

Step 7:

Click on ‘Schema’ as shown above and set its required properties, such as the schema name. For now, enter the name as ‘SchemaTest’.
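Under the hood, Schema Workbench maintains a Mondrian schema XML file. After naming the schema, that file is roughly the following (a sketch; Workbench may emit additional default attributes):

```xml
<Schema name="SchemaTest">
  <!-- cubes, dimensions and measures will be added below -->
</Schema>
```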

Step 8:

Right click on element ‘Schema’ and select ‘Add Cube’ option. This will add a new cube into the schema.

Step 9:

Set the name of the cube as ‘CubeTest’. Once it is done, the schema design will look like below.
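In the XML behind the designer, adding the cube corresponds roughly to a `<Cube>` element nested inside `<Schema>` (a sketch):

```xml
<Cube name="CubeTest">
  <!-- fact table, dimensions and measures go here -->
</Cube>
```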

Step 10:

Basically, a cube is a structure made up of a number of dimensions, measures, etc. Cubes rely on two kinds of tables: a ‘fact table’ (for the cube itself) and ‘dimension tables’ (for the cube’s different dimensions). A cube can have only one fact table but any number of dimension tables (depending on the number of dimensions in the cube).

So our next step is to set the fact table for the cube ‘CubeTest’. To do so, click on the icon before the cube image, as indicated by #2 in the screenshot above. This will expand the cube node, as in the image below.

Step 11:

Now click on the ‘Table’ element; this will list the attributes specific to it. Clicking on the ‘name’ attribute will display all tables available under the current data source (the database we configured in Step 5). Select the table ‘CUSTOMERS’.

Once you choose the table ‘PUBLIC -> CUSTOMERS’, the ‘schema’ attribute value will be filled in automatically as ‘PUBLIC’.

Note: If the fact table doesn’t belong to the schema mentioned in Step 5, then you must explicitly specify the schema to which your fact table belongs.
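In the underlying XML, the fact table appears as a `<Table>` child of the cube, roughly like this (a sketch):

```xml
<Cube name="CubeTest">
  <!-- the 'schema' attribute is filled in automatically as PUBLIC -->
  <Table schema="PUBLIC" name="CUSTOMERS"/>
</Cube>
```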

Step 12:

Now add a new dimension called ‘CustomerGeography’ to the cube by right clicking the cube element ‘CubeTest’.

Step 13:

For the new dimension added, set the required attribute values like name, foreign key, etc.

Set name of the dimension as ‘CustomerGeography’, and foreign key as ‘CUSTOMERNUMBER’.

Just double click on the dimension name ‘CustomerGeography’.  This will expand the node and display the ‘Hierarchy’.

Click on the ‘hierarchy’ in the left side pane, you can find the attribute properties for the hierarchy.

Set name -> CustomerGeo; allMemberName -> All Countries

Step 14:

Double-clicking the ‘Hierarchy’ element in the left-side pane will expand the node further and show the ‘Table’ element. Click on the ‘Table’ element to set the dimension table for the dimension ‘CustomerGeography’. This will list the related attributes in the right-side pane. Clicking on the ‘name’ attribute’s value field will list the tables available in the current schema.

Select it as ‘CUSTOMERS’. This will automatically fill the ‘schema’ field as ‘PUBLIC’.

Step 15:

Right click on the element ‘Hierarchy’ on the left side pane and select ‘Add Level’.

This will add a new level with name ‘New Level 0’. Refer to the below screenshot.

To rename and set other attributes, set the attribute values (as listed below) for the newly created level in the right side pane.

Name -> CustomerCountry

Column -> COUNTRY

Type -> String

uniqueMembers -> true

Now we have added a level called ‘CustomerCountry’.

Step 16:

To add another level, right-click on ‘Hierarchy’ in the left-side pane (as we did in Step 15) and select ‘Add Level’. This will add a new level named ‘New Level 1’. To rename it and set the other attribute values, set the attributes in the right-side pane as below,

Name -> CustomerCity

Column -> CITY

Type -> String

So far, we have created a cube with a dimension that will show two hierarchical levels of detail.
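In the schema XML, the dimension built in Steps 12–16 corresponds roughly to the fragment below. Note that the `primaryKey` attribute on the hierarchy is an assumption: the steps above do not set it explicitly, but Mondrian needs it to join the dimension table to the fact table.

```xml
<Dimension name="CustomerGeography" foreignKey="CUSTOMERNUMBER">
  <!-- primaryKey is assumed here; set it to the dimension table's key column -->
  <Hierarchy name="CustomerGeo" hasAll="true" allMemberName="All Countries"
             primaryKey="CUSTOMERNUMBER">
    <Table schema="PUBLIC" name="CUSTOMERS"/>
    <Level name="CustomerCountry" column="COUNTRY" type="String" uniqueMembers="true"/>
    <Level name="CustomerCity" column="CITY" type="String" uniqueMembers="false"/>
  </Hierarchy>
</Dimension>
```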

Step 17:

To add a new dimension to the cube, right-click on the cube item (CubeTest) in the left side pane then, select ‘Add Dimension’.

This will add a new dimension to the cube with a default name. To rename it and set other attribute values, click on the newly created dimension in the left side pane. This will list out the attributes for the dimension.

Set name -> CustomerContact; foreignKey -> CUSTOMERNUMBER.

Step 18:

To add hierarchy and levels for this dimension, double click on the dimension name which will expand the dimension node ‘CustomerContact’. Click on the ‘hierarchy’ element in the left side pane, then on the right side pane set the below attribute values.

Set name -> ‘’ (leave it blank); allMemberName -> All Contacts.

Step 19:

Double-clicking the ‘hierarchy’ element will expand the node, where you can set the dimension table for the dimension ‘CustomerContact’.

Click on the ‘Table’ element and select the table as ‘CUSTOMERS’.

Step 20:

To add a new level for this dimension or hierarchy, right-click on the element ‘hierarchy’ and select ‘Add Level’. This will add a new level to the hierarchy with name ‘New Level 0’. We can rename it by changing the attributes’ values like below,

name -> ‘CustomerNames’

Column -> CONTACTFIRSTNAME

Type -> String.
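The second dimension translates to a similar XML fragment (a sketch; again, `primaryKey` is an assumption the steps above do not set explicitly):

```xml
<Dimension name="CustomerContact" foreignKey="CUSTOMERNUMBER">
  <!-- the hierarchy name was left blank in Step 18 -->
  <Hierarchy hasAll="true" allMemberName="All Contacts" primaryKey="CUSTOMERNUMBER">
    <Table schema="PUBLIC" name="CUSTOMERS"/>
    <Level name="CustomerNames" column="CONTACTFIRSTNAME" type="String" uniqueMembers="false"/>
  </Hierarchy>
</Dimension>
```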

Step 21:

To add a new measure to the cube, right-click on the cube ‘CubeTest’ and select ‘Add Measure’. This will add a new measure named ‘New Measure 0’. You can rename it by changing the attribute values.

For example, we want to obtain the number of customers in each country/city.

Set attribute values like below,

Name -> CustomerCount

Aggregator -> count

Column -> CUSTOMERNUMBER

Format string -> ####

Datatype -> Integer

After setting up the measure, the cube (CubeTest) schema structure will look like below,
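In the XML, the measure is a single `<Measure>` element inside the cube (a sketch, using the attribute values set above):

```xml
<Measure name="CustomerCount" column="CUSTOMERNUMBER" aggregator="count"
         formatString="####" datatype="Integer"/>
```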

Step 22:

Now, click ‘File -> Save’ menu to save the cube schema. You can save it in your desired path.

For ex, save it as ‘TestCube.mondrian.xml’
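For reference, the saved file should contain roughly the following. This is a sketch assembled from the steps above; the `primaryKey` attributes are assumptions (the steps do not set them explicitly), and Workbench may emit additional defaults:

```xml
<Schema name="SchemaTest">
  <Cube name="CubeTest">
    <Table schema="PUBLIC" name="CUSTOMERS"/>
    <Dimension name="CustomerGeography" foreignKey="CUSTOMERNUMBER">
      <Hierarchy name="CustomerGeo" hasAll="true" allMemberName="All Countries"
                 primaryKey="CUSTOMERNUMBER">
        <Table schema="PUBLIC" name="CUSTOMERS"/>
        <Level name="CustomerCountry" column="COUNTRY" type="String" uniqueMembers="true"/>
        <Level name="CustomerCity" column="CITY" type="String" uniqueMembers="false"/>
      </Hierarchy>
    </Dimension>
    <Dimension name="CustomerContact" foreignKey="CUSTOMERNUMBER">
      <Hierarchy hasAll="true" allMemberName="All Contacts" primaryKey="CUSTOMERNUMBER">
        <Table schema="PUBLIC" name="CUSTOMERS"/>
        <Level name="CustomerNames" column="CONTACTFIRSTNAME" type="String" uniqueMembers="false"/>
      </Hierarchy>
    </Dimension>
    <Measure name="CustomerCount" column="CUSTOMERNUMBER" aggregator="count"
             formatString="####" datatype="Integer"/>
  </Cube>
</Schema>
```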

Step 23:

Once the schema is saved, you can publish the cube to the Pentaho server (we are using the Pentaho Community Edition BI Suite 3.0 stable version).

Select the ‘File -> Publish…’ menu item to publish the cube. This will open a publish dialog like the one below.

Follow the instructions in the screenshot above and click the ‘OK’ button (or click ‘Cancel’ to abort publishing). Once you click ‘OK’, you will see the processing action, as below,

Then a publish dialog will open, where you specify the location in the Pentaho server to which the cube should be published.

Choose the location where you want to publish and click on ‘Publish’ button.

On successful publishing, the system will display a dialog box with the message ‘Publish successful’.

Click ‘OK’ button.

Step 24:

To view the published cube, open the Pentaho server URL in a browser.

For ex, http://localhost:8080/pentaho

Click on ‘Pentaho User Console Login’ button, you can find a login dialog box.

Log in as user ‘joe’ (admin) with password ‘password’, or as any other user with administrative privileges. After selecting the user (or entering the credentials), click the ‘Login’ button to log in, or ‘Cancel’ to abort.

Step 25:

After logging in, you will be redirected to the Pentaho BI home page.

Click on the ‘New Analysis View’ button. This will list the schemas accessible to the currently logged-in user.

By default, this list would include ‘SampleData’ and ‘SteelWheels’, along with the schema we published earlier.

Step 26:

Select the ‘Schema’ as ‘SchemaTest’ and ‘Cube’ as ‘CubeTest’. Click ‘OK’ button.

Step 27:

This will generate the cube in an ‘Analysis View’ window like below.

Result:

A new cube has been designed, configured and published to the Pentaho server, and we have viewed it via the Pentaho User Console.
