Channel: SCN: Message List - SAP HANA Developer Center

xsjs SQL performance issue


Hi guys,

 

I have an issue with xsjs SQL performance.

 

I have a simple SQL query:

SELECT count(*) FROM "demo"."demo::view" WHERE "StateID"=5

It finishes execution in 400 ms in the HANA SQL console.

 

When I put the query in xsjs,

conn = $.db.getConnection();
query = 'SELECT count(*) FROM "demo"."demo::view" WHERE "StateID"=5';
pstmt = conn.prepareStatement(query);
rs = pstmt.executeQuery();

The xsjs service also responds in 400 ms.

 

However, if I read the request parameter and use it to compose the query with pstmt.setInteger():

stateID = $.request.parameters.get('StateID');
conn = $.db.getConnection();
query = 'SELECT count(*) FROM "demo"."demo::view" WHERE "StateID"=?';
pstmt = conn.prepareStatement(query);
pstmt.setInteger(1, stateID);
rs = pstmt.executeQuery();

The service execution time becomes more than 2.5 seconds.

 

Am I missing something here when calling SQL from xsjs?
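One thing worth checking in the parameterized version above: XS request parameters arrive as strings, so the value handed to setInteger() may be "5" rather than 5, and the bound type then no longer matches the column type. Whether that explains the 2.5 seconds is an assumption to verify, but coercing before binding is cheap. A minimal sketch in plain JavaScript (the toInt helper is hypothetical, not part of the XS API):

```javascript
// Request parameters come in as strings (e.g. "5"). Coerce to a real
// number before binding so the prepared statement sees an integer.
function toInt(param) {
    var n = parseInt(param, 10);
    if (isNaN(n)) {
        throw new Error("not an integer parameter: " + param);
    }
    return n;
}

var stateID = toInt("5"); // in XS: toInt($.request.parameters.get('StateID'))
// pstmt.setInteger(1, stateID);  // now binds an actual integer
```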


Re: xsjs SQL performance issue


This is expected, as these two statements produce different entries in the execution plan cache.

Try launching the XS version several times - it should come down to 400 ms as well.


Re: xsjs SQL performance issue


Hi Dzianis,

 

We did launch it several times, but it does not change; 2.5 seconds is the average time taken for the query.

Re: HEX String to INTEGER conversion


Yes, thanks for the input and this is exactly what I tried to do, but it yields:

 

Could not execute 'CREATE FUNCTION hexstr2int (IN i_hex VARCHAR(2000)) RETURNS o_result BIGINT LANGUAGE SQLSCRIPT SQL ...' in 1.603 seconds .

SAP DBTech JDBC: [7]: feature not supported: Scalar UDF does not support the type as its input/output argument: Varchar3

 

on the HANA version I currently use in the cloud. It seems that older HANA versions only support primitive data types for scalar UDFs (so VARCHAR is not supported).

 

Besides, my hope was that HANA provides a speedy, internal data type conversion out of the box (or even a numerical hash function), but it seems that hope was in vain. I can imagine many use cases where I would want numeric-hash-based partitioning. Worrying about hash storage seems somewhat outdated given today's memory sizes, where 32 bytes to store a hash value is fine even if I only plan to have 8 partitions with uniform distribution.
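For illustration, the hex-string-to-partition mapping discussed above can be sketched in plain JavaScript (not SQLScript; the function names are hypothetical). hexToInt mirrors what the scalar UDF would compute, and partitionOf reduces it modulo the partition count, which spreads rows uniformly assuming the input hashes are uniform:

```javascript
// Hypothetical sketch: hex string -> integer -> partition number.
// Note parseInt is only exact up to 2^53, so very long hex hashes
// would need to be truncated first.
function hexToInt(hex) {
    var n = parseInt(hex, 16);
    if (isNaN(n)) {
        throw new Error("not a hex string: " + hex);
    }
    return n;
}

function partitionOf(hex, partitionCount) {
    return hexToInt(hex) % partitionCount;
}

// e.g. partitionOf("FF", 8) -> 255 % 8 = 7
```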

 

Again, Lars, I really appreciate your input; in fact my procedure doesn't fit my need, but your UDF does. It is just not currently supported by the HANA cloud version.

 

Any clever tricks to solve this with HANA-internal type conversion?

 

Best regards,

Bodo

Re: HEX String to INTEGER conversion


Correct, character-type parameters for UDFs are supported as of SPS 8.

I thought the cloud version would be on SPS 8 by now - but apparently it's not.

 

I see the use case, but there's currently no string-unpacking data type conversion available, except for date/time data types.

 

About the partitioning: what exactly do you want to do?

Distribute data evenly into table partitions?

Partition type HASH does that already automatically.

 

- Lars

Re: HEX String to INTEGER conversion


My current use case is in the context of a PoC where I generate test data with configurable data volumes in a generic but deterministic way (a non-real-life scenario). I know that HANA provides internal partitioning out of the box (e.g., the PARTITION BY HASH (a) PARTITIONS 12 option for table creation). Is there a way to explicitly apply partition pruning in a SELECT statement, e.g. to enable parallel reading with an external ETL tool? In Oracle I would use a PARTITION clause to explicitly prune a query to a certain hash partition.

 

I admit that this might not be a common use case and may not be needed in many real-life scenarios. But in the past, explicit hashing/pruning/parallelization came in handy in some performance-critical ETL situations.

 

Many thanks,

Bodo

Re: How to execute a MDX query in SAP HANA Studio


So I executed this series of SQL statements:

 


set 'MDX_CATALOG_NAME' = 'testing';

 

MDX SELECT

{

[MEASURES].STO_OPEN_QUANTITY

} on COLUMNS,

{

[WERKS].MEMBERS

} on ROWS

from [CA_PFEP_STO_ALL];

 

MDX GET_AXISDATA 1 546DD7D434738C9CE1000000C6070FEB;

 

MDX GET_CELLDATA 546DD7D434738C9CE1000000C6070FEB;

 

MDX GET_AXISDATA 1 546DD7D434738C9CE1000000C6070FEB;

 

and they all produce data, but what I'm not getting is why it works this way. I can see in the backend the threads being generated to produce the pieces.

 

Is it like this so the 3rd-party app can piece it together? It looks to me as if HANA is breaking the MDX down into pieces and executing them.

 

So I'm not getting the reason for setting MDX_CATALOG_NAME and the GUIDs.

 

Can anyone explain?

 

I'd just like to produce a result set from the MDX SELECT.

 

Mike


How to setup a table with Time constraint


Hi,

How do I create a table with a time constraint in SAP HANA, i.e. with a validity period for a set of keys? I have not managed to link the begin and end dates together to form a validity interval, which particularly in HR tables is a must in order to pick up the record that is valid on a specific date.


So, taking a simple example: the keys of the table are EMPLOYEEID and the time range between the BEGIN and END dates.

So this would be possible:

EMPLOYEEID / BEGINDATE / ENDDATE / VALUE

12345678 / 2013-01-01 / 2013-12-31 / ValueA

12345678 / 2014-01-01 / 2014-09-16 / ValueB

12345678 / 2014-09-17 / 2014-12-01 / ValueC

23456789 / 2010-01-01 / 2015-01-01 / ValueA

...


but this would not, as it violates the time constraint of the key(s) because the dates overlap:

12345678 / 2014-01-01 / 2014-12-31 / ValueA

12345678 / 2014-02-02 / 2014-09-16 / ValueB

...


Please note that just defining the BEGIN and END dates as keys is not sufficient, since, as in the above example, we cannot just look at the individual dates as values but must consider the date interval they comprise.
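Leaving HANA's own temporal features aside, the interval semantics described above can be sketched in plain JavaScript: a lookup that picks the record valid on a given date, plus the overlap check that would reject the second example. Record shape and function names are illustrative only:

```javascript
// Dates as ISO strings, so plain string comparison orders them correctly.
// Records: { id, begin, end, value }.
function validOn(records, id, date) {
    // the record whose [begin, end] interval contains the date
    return records.filter(function (r) {
        return r.id === id && r.begin <= date && date <= r.end;
    })[0] || null;
}

function hasOverlap(records) {
    // true if any two records for the same id have overlapping intervals
    return records.some(function (a, i) {
        return records.some(function (b, j) {
            return i < j && a.id === b.id &&
                   a.begin <= b.end && b.begin <= a.end;
        });
    });
}

var rows = [
    { id: "12345678", begin: "2013-01-01", end: "2013-12-31", value: "ValueA" },
    { id: "12345678", begin: "2014-01-01", end: "2014-09-16", value: "ValueB" }
];
// validOn(rows, "12345678", "2014-05-01").value -> "ValueB"
// hasOverlap(rows) -> false; an interval that overlaps ValueB's makes it true
```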


Re: invalid datatype: "LOCATION" SPATIAL type in GROUP BY clause


Not really, Lars, but I just want to see the data which I have inserted in the table from the analytic view, so that I can expose it over a web service.

Does this mean that if I make LOCATION my primary key, this issue will be solved?

 

Harshawardhan

Re: Toggle line breakpoint error


You should first check out and import the file in the "Repositories" view.

Then open the file from the "Project Explorer" and set the breakpoint.

Re: invalid datatype: "LOCATION" SPATIAL type in GROUP BY clause


If you just want to see the data in the table, why don't you skip the analytic view altogether? It's not the right tool for this.

 

And how do you think would making the LOCATION primary key change its data type?

It wouldn't change the fact that it is simply not possible to group by columns of ST_POINT type.

 

In your web service just run a select against the table.

Re: invalid datatype: "LOCATION" SPATIAL type in GROUP BY clause


This is my flow: table -> analytic view -> .xsodata (to expose the view) -> .js (AJAX call to the .xsodata web service) -> .html

 

I am only aware of the below string in .xsodata for making analytic view data available over the web.

 

service {

       "<package_name>/<AnalyticView_Name>.analyticview" as "<call_v>" keys generate local "GID" aggregates always;

 

}

It would be great if you could let me know the string/code for the .xsodata file which can expose a column table in HANA instead of an analytic view. I tried to search for it, but no luck!

 

Harshawardhan

Re: invalid datatype: "LOCATION" SPATIAL type in GROUP BY clause


Can't really say why you didn't find this during your extensive search, but in the SAP HANA Development Concepts guide, chapter "5.1 Data Access with OData in SAP HANA XS" - "5.1.1 OData in SAP HANA XS", I find this:

 

Database objects

Expose an object using the object's database catalog (runtime) name. The support for database objects is mainly intended for existing or replicated objects that do not have a repository design-time representation. The following example shows how to include a reference to a table in an OData service definition using the table's runtime name.

 

service {

     "mySchema"."myTable" as "MyTable";

     }

 

...

 

A hammer is a hammer is a hammer...

 

I think the core question around the data preview error message has been resolved now.

This thread can be closed.

 

- Lars


Re: xsjs Library "sap.hana.xsopen.*"


Yes, we used this library at TechEd as a preview of a new SPS09 library. It will ship standard in SPS09 in a few weeks. The library path name will change slightly in the shipping version: it will be sap.hana.xs.libs.dbutils.xsds.

Re: Push notification


WebSockets are still in our product backlog. I can't comment further on when they might arrive. Instead of push, you can consider some form of polling mechanism.

HANA XS Outbound Http Issue


Hello all,

 

We are unable to call an HTTP destination from XS.

 

Below is the hana log:

XsIpConnection   IPConnection.cpp(00072) : IPconnection can not initialize libsapcrypto. rc = -40

 

In XSJS service:

HttpClient.request: request failed. The following error occured: unable to establish connection to <host>:443 - internal error code: Secure store initialization failed

 

The crypto library is installed successfully, and we were able to add and authenticate the HTTP destination in the XS Admin tool.

 

Can you please provide your inputs to resolve this?

 

Thanks,

Ramya


Left Outer join


Hi,

 

I have a small requirement which I want to achieve.

 

I have 2 tables table1 and table2.

 

Table1.:

 

A_ID         Desc1

 

1               ABC

1               DEF

2               GHI

2               JKL

3               MNO

 

Table2:

 

A_ID         Desc2

 

1               ghgh

2               rerer

 

After applying the logic, I want only those entries which exist in Table1 but not in Table2.

 

Final expected O/P:

 

A_ID         Desc1          Desc2

3               MNO            ?

 

Can someone please help me with the logic?

 

I don't think this can be achieved with a graphical calculation view; maybe with a SQL-based one.
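What is described above is a classic anti-join (in SQL, typically a LEFT OUTER JOIN with a WHERE ... IS NULL filter, or a NOT EXISTS subquery). As a sketch of the logic only, here it is in plain JavaScript with the sample data from the post:

```javascript
// Keep the Table1 rows whose A_ID has no match in Table2,
// carrying a null Desc2 (the "?" in the expected output).
function antiJoin(table1, table2) {
    var matched = {};
    table2.forEach(function (r) { matched[r.A_ID] = true; });
    return table1
        .filter(function (r) { return !matched[r.A_ID]; })
        .map(function (r) {
            return { A_ID: r.A_ID, Desc1: r.Desc1, Desc2: null };
        });
}

var table1 = [
    { A_ID: 1, Desc1: "ABC" }, { A_ID: 1, Desc1: "DEF" },
    { A_ID: 2, Desc1: "GHI" }, { A_ID: 2, Desc1: "JKL" },
    { A_ID: 3, Desc1: "MNO" }
];
var table2 = [{ A_ID: 1, Desc2: "ghgh" }, { A_ID: 2, Desc2: "rerer" }];
// antiJoin(table1, table2) -> [{ A_ID: 3, Desc1: "MNO", Desc2: null }]
```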

 

BR

Sumeet

