Spark and Accumulo Delegation tokens


Spark and Accumulo Delegation tokens

Jorge Machado-2
Hi Guys,

I’m in the middle of writing a Spark DataSource connector to connect to Accumulo tablets. Because we use Kerberos it gets a little tricky, since Spark only handles delegation tokens for HBase, Hive and HDFS.

Would a PR with an implementation of HadoopDelegationTokenProvider for Accumulo be accepted?


Jorge Machado






Re: Spark and Accumulo Delegation tokens

Saisai Shao
I think you can build your own Accumulo credential provider outside of Spark, similar to a
HadoopDelegationTokenProvider. Spark already provides an interface,
"ServiceCredentialProvider", for users to plug in a customized credential provider.
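
For illustration, a minimal sketch of such a provider could look like the one below. The
package, the class name and the obtainAccumuloToken helper are made-up placeholders; only
the three ServiceCredentialProvider members reflect the interface itself.

package com.example.accumulo

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.{Credentials, UserGroupInformation}
import org.apache.spark.SparkConf
import org.apache.spark.deploy.yarn.security.ServiceCredentialProvider

// Sketch of a pluggable credential provider for Accumulo.
class AccumuloServiceCredentialProvider extends ServiceCredentialProvider {

  // Name used in the spark.security.credentials.<serviceName>.enabled key.
  override def serviceName: String = "accumulo"

  // Only fetch tokens when the cluster is Kerberized.
  override def credentialsRequired(hadoopConf: Configuration): Boolean =
    UserGroupInformation.isSecurityEnabled

  override def obtainCredentials(
      hadoopConf: Configuration,
      sparkConf: SparkConf,
      creds: Credentials): Option[Long] = {
    // Placeholder: authenticate to Accumulo with the Kerberos credentials,
    // request a delegation token, and add it to `creds` so Spark ships it
    // to the executors; return the next renewal time if one is known.
    obtainAccumuloToken(sparkConf, creds)
  }

  // Hypothetical helper; a real implementation would use the Accumulo client
  // API (e.g. SecurityOperations#getDelegationToken) and call creds.addToken(...).
  private def obtainAccumuloToken(sparkConf: SparkConf, creds: Credentials): Option[Long] = None
}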

Thanks
Jerry


Re: Spark and Accumulo Delegation tokens

Jorge Machado-2
Hi Jerry,

Where do you see that class in Spark? I only found HadoopDelegationTokenManager, and I don’t see any way to add my provider to it.

private def getDelegationTokenProviders: Map[String, HadoopDelegationTokenProvider] = {
  val providers = List(new HadoopFSDelegationTokenProvider(fileSystems),
    new HiveDelegationTokenProvider,
    new HBaseDelegationTokenProvider)

  // Filter out providers for which spark.security.credentials.{service}.enabled is false.
  providers
    .filter { p => isServiceEnabled(p.serviceName) }
    .map { p => (p.serviceName, p) }
    .toMap
}

If you could give me a tip, that would be great.
Thanks

Jorge Machado






Re: Spark and Accumulo Delegation tokens

Saisai Shao
It is in the YARN module:
"org.apache.spark.deploy.yarn.security.ServiceCredentialProvider".


Re: Spark and Accumulo Delegation tokens

Jorge Machado
Thanks!

Should we contribute something like that?

Jorge Machado
[hidden email]

