[#5731] feat(auth-ranger): RangerAuthorizationHDFSPlugin supports Fileset authorization #5733

Merged
merged 37 commits into apache:main
Dec 13, 2024

Conversation

theoryxu
Contributor

@theoryxu theoryxu commented Dec 3, 2024

What changes were proposed in this pull request?

RangerAuthorizationHDFSPlugin supports Fileset authorization

Why are the changes needed?

Fix: #5731

Does this PR introduce any user-facing change?

Additional property keys in Fileset

How was this patch tested?

ITs
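
For context, here is a hedged sketch of how a user might enable Ranger authorization when creating a Hadoop (Fileset) catalog through the Gravitino Java client. The authorization property keys below mirror the keys used by the existing Ranger authorization plugin and are assumptions for illustration; the exact keys this PR adds for Fileset may differ.

import java.util.Map;
import org.apache.gravitino.Catalog;
import org.apache.gravitino.client.GravitinoClient;

public class RangerFilesetSketch {
  public static void main(String[] args) {
    // Hypothetical server address and metalake name.
    GravitinoClient client =
        GravitinoClient.builder("http://localhost:8090").withMetalake("metalake").build();

    // Assumed property keys, modeled on the Ranger plugin's existing configuration.
    Map<String, String> properties =
        Map.of(
            "location", "hdfs://localhost:9000/user/gravitino",
            "authorization-provider", "ranger",
            "authorization.ranger.admin.url", "http://localhost:6080",
            "authorization.ranger.auth.type", "simple",
            "authorization.ranger.username", "admin",
            "authorization.ranger.password", "admin",
            "authorization.ranger.service.name", "hdfsDev");

    // Create a Hadoop catalog; filesets under it are then subject to Ranger authorization.
    Catalog catalog =
        client.createCatalog(
            "fileset_catalog", Catalog.Type.FILESET, "hadoop", "Fileset catalog with Ranger", properties);
    System.out.println("Created catalog: " + catalog.name());
  }
}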

@xunliu
Member

xunliu commented Dec 4, 2024

@theoryxu Please rebase your PR onto the latest main branch. Thanks.

@theoryxu
Contributor Author

theoryxu commented Dec 4, 2024

> @theoryxu Please rebase your PR onto the latest main branch. Thanks.

done

@theoryxu theoryxu marked this pull request as ready for review December 5, 2024 09:36
@xunliu xunliu requested a review from jerqi December 11, 2024 11:47
@xunliu
Member

xunliu commented Dec 12, 2024

RangerFilesetIT > initializationError FAILED
    java.lang.IllegalArgumentException: Failed to operate catalog(s) [RangerFilesetE2EIT_catalog_1bb7992c] operation [CREATE] under metalake [metalake], reason [Invalid package path: /Users/xun/github/xunliu/gravitino/catalogs/catalog-hadoop/build/libs in [/Users/xun/github/xunliu/gravitino/catalogs/catalog-hadoop/build/libs, /Users/xun/github/xunliu/gravitino/catalogs/catalog-hadoop/build/resources/main, /Users/xun/github/xunliu/gravitino/authorizations/authorization-ranger/build/libs]]
    java.lang.IllegalArgumentException: Invalid package path: /Users/xun/github/xunliu/gravitino/catalogs/catalog-hadoop/build/libs in [/Users/xun/github/xunliu/gravitino/catalogs/catalog-hadoop/build/libs, /Users/xun/github/xunliu/gravitino/catalogs/catalog-hadoop/build/resources/main, /Users/xun/github/xunliu/gravitino/authorizations/authorization-ranger/build/libs]
        at org.apache.gravitino.utils.IsolatedClassLoader.buildClassLoader(IsolatedClassLoader.java:121)
        at org.apache.gravitino.catalog.CatalogManager.createClassLoader(CatalogManager.java:870)
        at org.apache.gravitino.catalog.CatalogManager.createCatalogWrapper(CatalogManager.java:800)
        at org.apache.gravitino.catalog.CatalogManager.lambda$createCatalog$5(CatalogManager.java:393)
        at com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2406)
        at java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1916)

I know why the CI failed: we didn't add catalogs:catalog-hadoop:jar to the authorization-ranger plugin's dependencies, so the Gravitino server couldn't create the HDFS catalog in the Fileset ITs.

@theoryxu Please use the following Gradle config to update authorizations/authorization-ranger/build.gradle.kts:

tasks.test {
  doFirst {
    environment("GRAVITINO_TEST", "true")
    environment("HADOOP_USER_NAME", "gravitino")
  }
  dependsOn(":catalogs:catalog-hive:jar", ":catalogs:catalog-hive:runtimeJars", ":catalogs:catalog-lakehouse-iceberg:jar", ":catalogs:catalog-lakehouse-iceberg:runtimeJars", ":catalogs:catalog-lakehouse-paimon:jar", ":catalogs:catalog-lakehouse-paimon:runtimeJars", ":catalogs:catalog-hadoop:jar", ":catalogs:catalog-hadoop:runtimeJars")

  val skipITs = project.hasProperty("skipITs")
  if (skipITs) {
    // Exclude integration tests
    exclude("**/integration/test/**")
  } else {
    dependsOn(tasks.jar)
  }
}

@theoryxu
Contributor Author

theoryxu commented Dec 12, 2024

> (quoting @xunliu's comment above: the RangerFilesetIT initializationError log and the suggested Gradle config for authorizations/authorization-ranger/build.gradle.kts)

I tried that, but it didn't work. Using getInstanceForTest to set isTestEnv makes the tests pass now.
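
As a rough illustration only, here is a minimal sketch of the getInstanceForTest/isTestEnv idea described above; the class and member names are hypothetical, not the PR's actual code. The point is a test-only factory that flags the instance as running in a test environment so initialization can skip the parts that require the full catalog classpath.

// Hypothetical sketch, not the real plugin class.
public class ExampleAuthorizationPlugin {
  private static volatile ExampleAuthorizationPlugin instance;
  private final boolean isTestEnv;

  private ExampleAuthorizationPlugin(boolean isTestEnv) {
    this.isTestEnv = isTestEnv;
  }

  // Production entry point: normal initialization.
  public static synchronized ExampleAuthorizationPlugin getInstance() {
    if (instance == null) {
      instance = new ExampleAuthorizationPlugin(false);
    }
    return instance;
  }

  // Test-only entry point, analogous to the getInstanceForTest mentioned above:
  // marks the instance as running in a test environment.
  public static synchronized ExampleAuthorizationPlugin getInstanceForTest() {
    instance = new ExampleAuthorizationPlugin(true);
    return instance;
  }

  public boolean isTestEnv() {
    return isTestEnv;
  }
}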

Member

@xunliu xunliu left a comment

Hi @theoryxu, thank you for your contributions.
LGTM

@xunliu xunliu merged commit b151461 into apache:main Dec 13, 2024
23 checks passed