Hello Deepak,
This little hint indeed helped save hours of my time.
Thank you!
Hi,
One quick question on this.
The AWS cost calculator shows a HANA instance at around $112 monthly by default, but I want to make sure: when you create an instance using your SAP ID, follow the step-by-step instructions, and finally provide the AWS key and secret key at launch time, does the process generate an instance with the same HANA server specification mentioned in the link below (such as 8 cores, 64 GB RAM, 127 GB storage)?
Any idea how much I will end up paying monthly?
- Dharmesh
Hi,
Thank you for your interest in my problem.
I have a workbook in Analyzer, and in this workbook I have many filters.
For example, in the first filter I have two choices: "dry" or "wet". In the second filter I can choose products.
Before HANA, when I chose "wet" in the first filter, the second filter let me choose only wet products (the list of possible choices was limited to wet products). Now the second filter lists all products, both wet and dry.
A selection in the first filter used to limit the possible choices in the second filter, and so on. Now this does not work.
This is of course only an example. I have a more complicated classification of products and many filters, so this is a real problem for me.
I'm sorry if my explanation is not clear.
Hi Izabella,
Thanks for your explanation. I understood what is happening, but some key points are still missing to understand how we can help (and whether we can).
You are perhaps describing things too much from the user perspective, while we need more technical details. Although I'm not a BI expert, I'll ask a few more questions, and with your answers another colleague may be able to help.
Is this BEx Analyzer, or Analysis for Microsoft Excel?
What changed besides the database?
Is it an SAP BW system that was migrated to SAP BW on HANA?
Is the source you are querying a custom one? Was your implementation remodeled to run with HANA information views?
Regards, Fernando Da Rós
Hi Ros,
Thank you for your willingness to help.
I wanted to answer your questions, but I've just received information from our IT department that we implemented one more note, and this time it solved our problem.
This note is:
2074430 (ver. 3) Technical enhancement for input help
We had already implemented the following notes:
2082804 SQL internal parse tree depth exceeds its maximum: join depth exceeds its maxim
2074430 (ver. 2) Technical enhancement for input help
2048624 F4: When you execute the input help, data that does not correspond to the read
2044626 Input help returns the following error: "sid based seldr but no sid filter allo
additionally:
2017665 Incorrect data with master data provider for InfoProvider navigation attributes
2047764 Input help for hierarchy node does not access HANA/BWA
2008763 Input help in mode D does not work for navigation attributes
2068034 Long runtimes for value help based on an SAP HANA model
BR
Iza
Hi Izabella,
Thanks for coming back with the solution.
Next time, please give this background information in your first post.
Best regards, Fernando Da Rós
Suren,
We have no clue what is in PARTITION_RANGE in your PARTITION_MAP table, so we cannot know what the ALTER TABLE statement effectively looks like.
To debug your code, I suggest making the generated statement visible and looking for the error in it.
I was able to get this running with the correct syntax, so I'm fairly sure the problem is in how you build the ALTER TABLE statement.
- Lars
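The suggestion above can be sketched like this (the table name, column, and range values here are purely illustrative, and the exact partition clause should be checked against the HANA SQL reference for your revision): build the ALTER TABLE statement as a string, print it, and only then execute it, so a malformed value coming from PARTITION_RANGE shows up in the generated SQL instead of in a cryptic error.

```javascript
// Hypothetical sketch: assemble the dynamic ALTER TABLE statement as a
// string and log it before executing, so a bad PARTITION_RANGE value can
// be spotted by reading the generated SQL.
function buildAlterStatement(tableName, partitionColumn, ranges) {
    // ranges like [["2013", "2014"]] ->
    //   "PARTITION 2013 <= VALUES < 2014"
    var parts = ranges.map(function (r) {
        return "PARTITION " + r[0] + " <= VALUES < " + r[1];
    });
    return "ALTER TABLE " + tableName +
           " PARTITION BY RANGE (" + partitionColumn + ") (" +
           parts.join(", ") + ", PARTITION OTHERS)";
}

var stmt = buildAlterStatement("SALES", "YEAR", [["2013", "2014"]]);
console.log(stmt); // inspect the generated statement before running it
```

Once the statement looks right when printed, pass the same string to your execute call.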
Hi Experts,
I have written an HTML/JavaScript page that returns LinkedIn profile details in JSON format. Now I want to add that data into my HANA table. Should I convert this to an XSJS service? If so, can you guide me through the steps? Should I write an .xshttpdest file? Sample code would be really helpful.
My Code:
<html>
  <head>
    <title>LinkedIn JavaScript API Hello World</title>
    <!-- 1. Include the LinkedIn JavaScript API and define an onLoad callback function -->
    <script type="text/javascript" src="https://platform.linkedin.com/in.js">/*
      api_key: 75eq444tllkike
      onLoad: onLinkedInLoad
      authorize: true
      credentials_cookie: true
      credentials_cookie_crc: true
    */</script>
    <script type="text/javascript">
      function onLinkedInLoad() {
        IN.Event.on(IN, "auth", onLinkedInAuth);
      }

      function onLinkedInLogout() {
        setConnections({}, {total: 0});
      }

      function onLinkedInAuth() {
        IN.API.Connections("me")
          .fields("firstName", "lastName", "industry", "summary", "headline",
                  "specialties", "positions", "interests", "educations")
          .result(displayConnections)
          .error(displayConnectionsErrors);
      }

      function displayConnections(connections) {
        var connectionsDiv = document.getElementById("connections");
        var members = connections.values; // The list of members you are connected to
        /*for (var member in members) {
          connectionsDiv.innerHTML += "<p>" + members[member].firstName + " " +
            members[member].lastName + " works in the " +
            members[member].industry + " industry";
        }*/
        document.write(JSON.stringify(members));
      }

      function displayConnectionsErrors(error) { /* do nothing */ }
    </script>
  </head>
  <body>
    <!-- 3. Displays a button to let the viewer authenticate -->
    <script type="IN/Login"></script>
    <!-- 4. Placeholder for the greeting -->
    <div id="profiles"></div>
    <div id="connections"></div>
  </body>
</html>
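For reference, here is how I imagine flattening the returned members JSON into rows for insertion; the column choice is just my guess, not a confirmed HANA or LinkedIn API:

```javascript
// Hypothetical sketch: flatten the LinkedIn "members" JSON into rows that
// a prepared INSERT statement (e.g. in an XSJS service) could bind.
// The columns chosen here are illustrative only.
function membersToRows(members) {
    return members.map(function (m) {
        return [
            m.firstName || "",
            m.lastName || "",
            m.industry || "",
            m.headline || ""
        ];
    });
}

var rows = membersToRows([
    {firstName: "Ada", lastName: "Lovelace", industry: "Computing"}
]);
// each row is ready to bind to "INSERT INTO ... VALUES (?, ?, ?, ?)"
console.log(rows);
```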
Kindly Assist
Hi HANA experts,
I'm trying to create a simple .hdbprocedure, but when I try to activate it I get the following error:
Could not create catalog object: insufficient privilege; Not authorized
Do you guys know what privileges I would need to add to my user? Thanks.
Hi Lars and Naresh
Thank you very much for your help.
I was able to resolve it with installing Java 1.7.
I had set the PATH also just in case.
Thanks and regards
Vikas.
We are installing the HANA client for Windows and get an error message indicating that there is not enough space left (image attached). We already tried installing both the 64-bit and 32-bit versions, and both show the same error. We were working on HANA rev. 69 and upgraded to rev. 74. Any help will be appreciated.
BTW, we are on HANA v 1.0.7402.
Hi Harshawardhan
If you are on SPS07, which is our first release with spatial capabilities, a workaround is not really possible. On SPS08 you can work with WKT strings and ST_GeomFromWKT instead of native ST_Point columns, but I would recommend waiting for SPS09.
Best regards,
Hinnerk
This may not be public information just yet, but I figured I would ask just in case: are there any plans (or negotiations) to put HANA, preferably SPS09, on the SL cloud?
Not answering may be interpreted as a yes by some folks.
thx,
greg
Kemal, thank you for the response. I will try your code.
Rahul,
Not only does the executing user ID need the _SYS_REPO procedure execution privilege; the _SYS_REPO user also needs the grant option for all actions on all schemas that appear in the role being granted, for instance:
GRANT <privilege-list> ON SCHEMA <schema-name> TO _SYS_REPO WITH GRANT OPTION;
Of course, if you are using .hdbschema etc. files to create your work products, they will already reside in _SYS_REPO's domain.
Hi Ibrahim,
Firstly, you should create a file named logout.xscfunc and paste in the following code:
{
    "library": "libxsauthenticator",
    "factory": "createApp",
    "method": "logout"
}
Then use this function in your shell:
logout: function () {
    jQuery.ajax({
        url: "/tbox/logout.xscfunc", // for different servers, cross-domain restrictions need to be handled
        type: "POST",
        dataType: "json",
        success: function (data, s, xhr) { // callback called when data is received
            document.location.reload(true);
        },
        error: function (jqXHR, textStatus, errorThrown) {
            document.location.reload(true);
        }
    });
},
Hope this helps.
Please replace new sap.viz.ui5.VizContainer with new sap.viz.ui5.Column. It will then work.
Hi,
I want to check whether data from HANA can be exported to Cognos TM1, since TM1 has more analysis functions for actual and plan finance data (like BW SEM).
Can anyone please advise on the pros and cons of the scenarios below, or whether integration is possible by one of these methods?
1) Can Cognos TM1 integrate directly with HANA and load the data? If yes, please give an outline of the steps.
2) Can TurboIntegrator (similar to BODS) read data from HANA and then load it into Cognos TM1? TurboIntegrator and Cognos are already set up; I only need to check whether TurboIntegrator can read the views in HANA.
3) Can BODS pull data from HANA into staging and then load it into Cognos? Can BODS load data into Cognos via TurboIntegrator, or load data from BODS into Cognos TM1 directly?
4) Any pros and cons of loading data from HANA to Cognos TM1 using ODBC?
I know that exporting data from HANA is not an ideal methodology, but the business requires it, and we have to load data from HANA to Cognos TM1, not from ECC to Cognos.
Any help would be greatly appreciated.
Thanks,
Venkat.