Hello Eric,
Please find my responses in-line:
1. On the HANA side, for HANA - Spark integration, do we need anything else besides the HANA Spark adapter to be installed?
The minimum HANA version required is SPS10 (revision 102) or higher. If you want to use Hadoop for data aging, you need to install Data Lifecycle Manager (DLM) on HANA.
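You can confirm the installed revision from the M_DATABASE monitoring view (standard HANA SQL; a version string of 1.00.102 or higher corresponds to SPS10 revision 102+):

```sql
-- Check the installed HANA revision; should report 1.00.102 or higher
SELECT VERSION FROM "SYS"."M_DATABASE";
```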
2. For HANA - Spark integration, is a Spark standalone installation sufficient, or do we need additional components such as Hadoop, Thrift Server, Ambari, etc.?
The Hive metastore and YARN are required.
3. Does HANA - Spark integration require the storage to be HDFS, or could we leverage storage such as AWS S3?
In order to leverage AWS S3 storage, you may have to install SAP HANA Vora.
4. Would the HANA Developer Edition include features such as Spark integration, or is it part of the Enterprise Edition only?
I assume the Developer Edition should work, though I have not tried it on the Developer Edition.
5. Could the latest version of Apache Spark (1.6.x) be leveraged and supported for integration with HANA, or are we restricted to Spark 1.4.x/1.5.x for support/compatibility purposes?
Restricted to Spark 1.4.1 for Spark Controller 1.0, and Spark 1.5.2 for Spark Controller 1.5.
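For reference, once the Spark Controller is installed and running, the connection from HANA is typically set up as a remote source over the "sparksql" adapter, along the lines below. Host, schema, table names, and credentials here are illustrative placeholders; check the Administration Guide for the exact configuration options for your Spark Controller version:

```sql
-- Create a remote source pointing at the Spark Controller
-- (7860 is the default Spark Controller port; replace host/user/password)
CREATE REMOTE SOURCE "SPARK_SQL" ADAPTER "sparksql"
  CONFIGURATION 'server=<controller_host>;port=7860;ssl_mode=disabled'
  WITH CREDENTIAL TYPE 'PASSWORD' USING 'user=hanaes;password=<password>';

-- Expose a Hive table in HANA as a virtual table for querying via SDA
CREATE VIRTUAL TABLE "MYSCHEMA"."V_SALES"
  AT "SPARK_SQL"."<NULL>"."default"."sales";
```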
Additional information:
Installation:
SAP HANA Spark Controller - SAP HANA Administration Guide - SAP Library
Useful Notes:
(Also check the referenced notes under each note and the attached docs)
2177933
2257657
2273047
HTH
Gopal