@anaconda-renovate anaconda-renovate bot commented Jul 31, 2023

This PR contains the following updates:

| Package | Type | Update | Change |
|---------|------|--------|--------|
| org.apache.hadoop:hadoop-common | compile | minor | 3.3.2 -> 3.4.0 |
renovate update details

| Field | Value |
|-------|-------|
| manager | maven |
| categories | java |
| datasource | maven |
| depName | org.apache.hadoop:hadoop-common |
| depType¹ | compile |
| packageName | org.apache.hadoop:hadoop-common |
| sourceUrl | |
| updateType | minor |
| versioning | maven |

¹ only available for some managers

GitHub Vulnerability Alerts

CVE-2022-25168

Apache Hadoop's FileUtil.unTar(File, File) API does not escape the input file name before passing it to the shell, so an attacker can inject arbitrary commands. In Hadoop 3.3 this code path is only used by InMemoryAliasMap.completeBootstrapTransfer, which is only ever run by a local user. It was used in Hadoop 2.x for YARN localization, which does enable remote code execution. It is also used in Apache Spark via the SQL command ADD ARCHIVE; since ADD ARCHIVE adds new binaries to the classpath, being able to execute shell scripts does not confer new permissions to the caller. SPARK-38305 ("Check existence of file before untarring/zipping"), included in Spark 3.3.0, 3.1.4, and 3.2.2, prevents shell commands from being executed regardless of which version of the Hadoop libraries is in use. Users should upgrade to Apache Hadoop 2.10.2, 3.2.4, 3.3.3 or later (which include HADOOP-18136).
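
For illustration, here is a minimal Java sketch of the bug class described above (not Hadoop's actual implementation; the class and method names are hypothetical): interpolating an unescaped file name into a shell command line allows injection, whereas passing the name as a single argument-vector element avoids re-parsing it as shell syntax.

```java
import java.io.File;
import java.io.IOException;

// Illustrative sketch of the CVE-2022-25168 bug class; not Hadoop's code.
public class UntarSketch {

    // UNSAFE: the file name is spliced into a shell command line, so a name such as
    // "evil.tar; touch /tmp/pwned" injects an extra command.
    static void untarViaShell(File tarFile, File targetDir)
            throws IOException, InterruptedException {
        String cmd = "cd '" + targetDir.getAbsolutePath() + "' && tar -xf " + tarFile.getAbsolutePath();
        new ProcessBuilder("bash", "-c", cmd).inheritIO().start().waitFor();
    }

    // Safer: no shell is involved, so metacharacters in the file name are never
    // interpreted as commands.
    static void untarWithoutShell(File tarFile, File targetDir)
            throws IOException, InterruptedException {
        new ProcessBuilder("tar", "-xf", tarFile.getAbsolutePath(),
                           "-C", targetDir.getAbsolutePath())
                .inheritIO().start().waitFor();
    }
}
```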

CVE-2024-23454

Apache Hadoop's RunJar.run() does not set permissions for its temporary directory by default. If sensitive data is present in that directory, other local users may be able to view its contents: on Unix-like systems the system temporary directory is shared between all local users, so files written there without explicitly setting the correct POSIX permissions may be readable by every other local user.
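
The general mitigation for this class of leak (not necessarily the exact Hadoop patch) is to create the temporary directory with owner-only permissions instead of relying on the default umask. A minimal java.nio sketch, with an arbitrary prefix name:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.FileAttribute;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

// Create a temp directory readable only by its owner. On non-POSIX file systems
// the attribute is unsupported and createTempDirectory throws
// UnsupportedOperationException.
public class PrivateTempDir {
    public static Path create() throws IOException {
        Set<PosixFilePermission> ownerOnly = PosixFilePermissions.fromString("rwx------");
        FileAttribute<Set<PosixFilePermission>> attr = PosixFilePermissions.asFileAttribute(ownerOnly);
        return Files.createTempDirectory("private-unjar", attr); // prefix is arbitrary
    }

    public static void main(String[] args) throws IOException {
        System.out.println("created: " + create());
    }
}
```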

CVE-2022-26612

In Apache Hadoop, the unTar function uses the unTarUsingJava function on Windows and the built-in tar utility on Unix and other OSes. As a result, a TAR entry may create a symlink under the expected extraction directory that points to an external directory, and a subsequent TAR entry may then extract an arbitrary file into that external directory via the symlink. On Unix this is caught by the targetDirPath check because of the getCanonicalPath call; on Windows, however, getCanonicalPath does not resolve symbolic links, which bypasses the check. Since unpackEntries follows symbolic links during TAR extraction, this allows writing outside the expected base directory on Windows. This was addressed in Apache Hadoop 2.10.2, 3.2.3, 3.3.3, and 3.4.0.
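
The usual defence against this kind of "tar slip", sketched below for a hypothetical extraction loop (this is not Hadoop's actual fix), is to resolve each entry against the real, symlink-resolved extraction directory and reject anything whose nearest existing ancestor escapes it, rather than relying on getCanonicalPath alone.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.LinkOption;
import java.nio.file.Path;

// Hypothetical helper for an archive-extraction loop. It rejects entries that
// lexically escape the base directory, and also entries whose already-extracted
// parent is a symlink pointing outside it.
public class SafeExtractPath {

    static Path resolveEntry(Path baseDir, String entryName) throws IOException {
        Path base = baseDir.toRealPath();                 // baseDir must already exist
        Path target = base.resolve(entryName).normalize();
        if (!target.startsWith(base)) {
            throw new IOException("archive entry escapes extraction dir: " + entryName);
        }
        // Walk up to the nearest ancestor that already exists and resolve its
        // symlinks; this catches a symlinked parent created by an earlier entry.
        Path probe = target.getParent();
        while (probe != null && !Files.exists(probe, LinkOption.NOFOLLOW_LINKS)) {
            probe = probe.getParent();
        }
        if (probe != null && !probe.toRealPath().startsWith(base)) {
            throw new IOException("entry parent resolves outside extraction dir: " + entryName);
        }
        return target;
    }
}
```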


Configuration

📅 Schedule: Branch creation - "" in timezone UTC, Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


- [ ] If you want to rebase/retry this PR, check this box

This PR has been generated by Renovate Bot.

@anaconda-renovate anaconda-renovate bot changed the title from "chore(deps): update dependency org.apache.hadoop:hadoop-common to v3.3.3 [security]" to "chore(deps): update dependency org.apache.hadoop:hadoop-common to v3.4.0 [security]" on Sep 26, 2024
@anaconda-renovate anaconda-renovate bot force-pushed the renovate/maven-org.apache.hadoop-hadoop-common-vulnerability branch from d340afd to a23eb27 on September 26, 2024 01:41