Cleanup redundant maybeCacheToken
`maybeCacheToken` has a misleading name: it is really just an optional initial token that is used until it expires

https://github.com/cognitedata/cognite-sdk-scala/blob/2c1e74e67b3eba42ca564f53af08021ba146c642/src/main/scala/com/cognite/sdk/scala/common/OAuth2.scala#L173
https://github.com/cognitedata/cognite-sdk-scala/blob/2c1e74e67b3eba42ca564f53af08021ba146c642/src/main/scala/com/cognite/sdk/scala/common/OAuth2.scala#L200

After it expires, or if it was never provided, the normal path is used to fetch a new token, persist it via `CachedResource`, and invalidate/refresh it as needed in `commonGetAuth`:
https://github.com/cognitedata/cognite-sdk-scala/blob/2c1e74e67b3eba42ca564f53af08021ba146c642/src/main/scala/com/cognite/sdk/scala/common/OAuth2.scala#L133
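
As a mental model, here is a minimal sketch of that behaviour in plain cats-effect. It is NOT the SDK's actual `OAuth2.scala` code; `Token`, `TokenProvider` and `maybeSeed` are invented names. The point is that the seed token merely pre-populates the cache and is then handled by exactly the same refresh logic as any fetched token:

```scala
import java.time.Instant

import cats.effect.{IO, Ref}

// Toy model only -- not the SDK code.
final case class Token(value: String, expiresAt: Instant)

final class TokenProvider private (fetch: IO[Token], cache: Ref[IO, Option[Token]]) {
  private def stillValid(t: Token): IO[Boolean] =
    IO.realTimeInstant.map(now => t.expiresAt.isAfter(now))

  // Serve the cached token (seed or previously fetched) while it is valid,
  // otherwise fetch a fresh one and cache it.
  val getToken: IO[Token] =
    cache.get.flatMap {
      case Some(t) => stillValid(t).flatMap(ok => if (ok) IO.pure(t) else refresh)
      case None    => refresh
    }

  private def refresh: IO[Token] =
    fetch.flatMap(t => cache.set(Some(t)).as(t))
}

object TokenProvider {
  // maybeSeed plays the role of maybeCacheToken: an optional starting value
  // for the cache, not a separate code path.
  def apply(fetch: IO[Token], maybeSeed: Option[Token]): IO[TokenProvider] =
    Ref.of[IO, Option[Token]](maybeSeed).map(new TokenProvider(fetch, _))
}
```

In this model the same `maybeSeed` value can be handed to several providers, which is the "single seed token for multiple auth instances" intent described next.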

The intent is to be able to have a single seed token shared by multiple auth instances.

Here the initial token was supplied outside the normal flow, but it ends up on exactly the same path as an expire-and-refresh. Removing the manual initial-token setup therefore only saves some eager requests and the extra expiration checks on each refresh (the current token-fetching code checks the old initial token's expiration on every single call, even long after it has expired).
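
Concretely, the eager setup being removed from the connector is the snippet below (excerpted from the diff further down, not new code); its only effect was one blocking token fetch at construction time whose result was handed to the provider as the seed:

```scala
// Excerpt of the removed connector code (see the diff below).
import CdpConnector.ioRuntime // removed together with the unsafeRunSync() call that needed it

private val cacheToken = Some(
  IO.pure(credentials.getAuth[IO]().attempt.map(_.toOption).unsafeRunSync()))
```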

To address separately in the Scala SDK later:
- maybe unwrap `F[Option[initial token]]` back to `Option[initial token]`
- rename `maybeCacheToken` to something like `maybeInitialToken`
- add support for an initial token in the SDK, so consumer code does not have to hijack the refresh path each time (a rough sketch follows this list)
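
A very rough sketch of what that last point could look like. This is purely illustrative and not the current cognite-sdk-scala API; `InitialTokenSupport`, `IssuedToken` and `maybeInitialToken` are made-up names:

```scala
import cats.effect.IO

// Hypothetical API shape only -- not the existing cognite-sdk-scala signatures.
object FutureSdkSketch {
  final case class IssuedToken(accessToken: String, expiresAtEpochSecond: Long)

  trait InitialTokenSupport[Credentials, Provider] {
    // maybeInitialToken would simply pre-populate the provider's token cache;
    // once it expires, the provider's normal refresh path takes over, so the
    // consumer never has to run its own eager fetch.
    def provider(
        credentials: Credentials,
        maybeInitialToken: Option[IssuedToken]): IO[Provider]
  }
}
```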
dmivankov committed Aug 17, 2023
1 parent e84520f commit 62435e9
Showing 1 changed file with 4 additions and 18 deletions.
src/main/scala/cognite/spark/v1/CdfSparkAuth.scala
```diff
@@ -16,40 +16,26 @@ object CdfSparkAuth {
     IO(AuthProvider(auth))
   }
 
-  import CdpConnector.ioRuntime
-
-  final case class OAuth2ClientCredentials(credentials: OAuth2.ClientCredentials)(
-      implicit sttpBackend: SttpBackend[IO, Any])
+  final case class OAuth2ClientCredentials(credentials: OAuth2.ClientCredentials)
       extends CdfSparkAuth {
-    private val cacheToken = Some(
-      IO.pure(credentials.getAuth[IO]().attempt.map(_.toOption).unsafeRunSync()))
 
     override def provider(
         implicit clock: Clock[IO],
         sttpBackend: SttpBackend[IO, Any]): IO[AuthProvider[IO]] =
-      OAuth2.ClientCredentialsProvider[IO](credentials, maybeCacheToken = cacheToken)
+      OAuth2.ClientCredentialsProvider[IO](credentials)
   }
 
-  final case class OAuth2Sessions(session: OAuth2.Session)(implicit sttpBackend: SttpBackend[IO, Any])
+  final case class OAuth2Sessions(session: OAuth2.Session)
       extends CdfSparkAuth {
 
     private val refreshSecondsBeforeExpiration = 300L
 
-    private val cacheToken = Some(
-      IO.pure(
-        session
-          .getAuth[IO](refreshSecondsBeforeExpiration = refreshSecondsBeforeExpiration)
-          .attempt
-          .map(_.toOption)
-          .unsafeRunSync()))
-
     override def provider(
         implicit clock: Clock[IO],
         sttpBackend: SttpBackend[IO, Any]): IO[AuthProvider[IO]] =
       OAuth2.SessionProvider[IO](
         session,
-        refreshSecondsBeforeExpiration = refreshSecondsBeforeExpiration,
-        maybeCacheToken = cacheToken
+        refreshSecondsBeforeExpiration = refreshSecondsBeforeExpiration
       )
   }
 }
```
