Not needed on recent Windows? #199

Open
gaborcsardi opened this issue Mar 18, 2025 · 3 comments

Comments

@gaborcsardi
Member

Hi Gabor,

a number of packages are failing their checks on my system, claiming
that I don't have the tools necessary to compile a package. This includes
checkhelper. Compilation of vol2birdR fails because it chooses incorrect
HDF5 flags, for the same underlying reason.

I do indeed have such tools. The problem is that pkgbuild assumes that
Rtools has to be installed on Windows from the .exe installer. But Rtools
can also be installed from a tarball, with, say, a separate Msys2
installation for the build tools: this is documented as a valid, normal
way to install Rtools, and it is used e.g. in some GitHub Actions
workflows. So the detection is wrong. My work-around is to create an
empty c:\rtools43\usr\bin directory, but that really shouldn't be
necessary.
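For reference, the work-around amounts to a single call (using the c:\rtools43 path from above; this is only the stop-gap described here, not a recommendation):

```r
# Work-around described above: create an empty directory where pkgbuild's
# detection expects the Rtools .exe-installer layout to be.
dir.create("c:/rtools43/usr/bin", recursive = TRUE, showWarnings = FALSE)
```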

In addition to that, one doesn't even need Rtools to build packages. I
don't think it is good to constrain R packages to Rtools. This is
normally not needed, it makes testing and development of Rtools harder,
and it would make it harder to use any custom or experimental toolchains
and tools (such as Msys2). If packages really need to depend on some
specific details of the build system (and only very few should need to),
they should instead check for those specific things in the build system,
in the same spirit as, say, autoconf or cmake does. Note that e.g. the
sources of R itself try to be prepared for this, allowing "custom"
tools, etc.

I don't think one should ever ask whether Rtools specifically is
installed, or which version of it, rather than asking, say, whether a C
compiler or a Fortran compiler is available, or whether a simple R
package can be built from source. After all, I assume this is how you do
it on Unix?
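A minimal sketch of such a check, for illustration only (the helper name is made up and this is not pkgbuild's API): instead of detecting Rtools, try to compile a trivial C file with the toolchain R is configured to use.

```r
# Sketch of a capability check: can a trivial C file be compiled with the
# compiler and flags from R's Makeconf? (Illustrative helper, not pkgbuild API.)
has_working_c_compiler <- function() {
  src <- tempfile(fileext = ".c")
  writeLines("void pkgbuild_test(void) {}", src)
  old <- setwd(tempdir())       # R CMD SHLIB writes its output here
  on.exit(setwd(old), add = TRUE)
  rbin <- file.path(R.home("bin"), "R")
  status <- system2(rbin, c("CMD", "SHLIB", shQuote(src)),
                    stdout = FALSE, stderr = FALSE)
  identical(status, 0L)
}
```

A Fortran check could be done the same way with a trivial Fortran source file.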

The current versions of Rtools have a file ".version", which you can
find relative to $(R_TOOLS_SOFT), regardless of whether Rtools has been
installed from the exe or from the tarball, and it also gives you the
exact version - e.g. Rtools 4.3 will be updated soon, and a number of
packages in it as well. But this really shouldn't be something packages
condition on.
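If one did need to locate that file programmatically (even though, as noted, packages shouldn't condition on it), a sketch might look like the following. It assumes that R CMD config can report R_TOOLS_SOFT from etc/Makeconf (which needs sh and make on the PATH) and that ".version" sits directly under that directory; the exact relative location is not spelled out above.

```r
# Hedged sketch: read Rtools' ".version" via $(R_TOOLS_SOFT).
# Assumptions: R >= 4.2 on Windows, `R CMD config R_TOOLS_SOFT` returns the
# toolchain root, and ".version" lives directly under it.
rtools_version_file <- function() {
  rbin <- file.path(R.home("bin"), "R")
  soft <- tryCatch(
    system2(rbin, c("CMD", "config", "R_TOOLS_SOFT"),
            stdout = TRUE, stderr = FALSE),
    warning = function(w) character(), error = function(e) character()
  )
  if (!length(soft) || !nzchar(soft[1])) return(NULL)
  vfile <- file.path(soft[1], ".version")
  if (file.exists(vfile)) readLines(vfile, warn = FALSE) else NULL
}
```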

Best
Tomas

@gaborcsardi
Member Author

Dear Gabor,

I've just spent another two hours on this (in addition to the hours Uwe
has spent). This time we were testing Rtools45 and looking at FLASHMM
and rankrate, trying to find out why they fail to untar some files. It
turned out to be because pkgbuild tries to compute which Rtools belongs
to the current version of R and puts that Rtools on the PATH. I already
reported this to you in June 2023, and I have since run into this
problem several times in different places when debugging packages with
pkgbuild.

R packages must not explicitly try to compute which Rtools is used and
where it might be installed. This is something external to R.

This time, technically, the problem was that the Rtools44 build tools
were put on the PATH via pkgbuild::withr_with_path, called from
devtools. This happened while we were testing R-devel with Rtools45,
running from a shell that did have "tar", but that "tar" used an
incompatible Cygwin runtime, because the Cygwin runtime in Msys2 is
updated from time to time.

But the key point is: no package should ever compute which Rtools may
be used with R, and this is what I reported to you in June 2023. Rtools
is put on the PATH either automatically by an installer version of R (as
documented), or explicitly by the end user - typically when the user is
using a toolchain bundle of Rtools (as documented), when the user is
testing R or packages with a different toolchain (such as Msys2
toolchains with recent LLVM or GCC), or indeed when testing development
versions of Rtools.
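A small illustration of that point (a sketch, not what pkgbuild currently does): if a helper needs tar, it can use whatever tar the user's PATH already provides instead of prepending a guessed Rtools directory.

```r
# Sketch: respect the PATH as the user/R set it up, rather than prepending a
# guessed Rtools location. Sys.which() reports the tar the shell would use.
tar_on_path <- Sys.which("tar")
if (nzchar(tar_on_path)) {
  # e.g. pass it along explicitly: utils::untar(tarfile, tar = tar_on_path)
  message("using tar from PATH: ", tar_on_path)
} else {
  stop("no 'tar' found on the PATH")
}
```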

I've run into this problem in pkgbuild several times already, when
other packages ended up deciding which libraries to link against based
on which version of Rtools pkgbuild thought was in use. This was a
really unhelpful practice that added maintenance cost for the package
authors (as well as for me).

Is there a technical reason why you are doing this? Perhaps we could try
to find a different solution?

Best
Tomas

@gaborcsardi gaborcsardi mentioned this issue Mar 18, 2025
@gaborcsardi
Member Author

On R 4.2.x (probably starting with 4.2.0) and later, R itself puts the correct Rtools on the PATH, even if it is not installed. So starting from that version:

  • we don't need to put anything on the PATH,
  • we can start searching for "Rtools" (or any appropriate compiler) on the PATH, really just looking at the PATH as it is, and not trying to be smart by finding ls and then adjusting the PATH from there. We could call R CMD config (see the sketch below), which works if sh and make are on the PATH; this takes about 500 ms on my laptop. In pkgdepends/pak we don't even need that: we can just try the installation, like on Unix.
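A rough sketch of that check (illustrative only; the helper name is made up): verify that sh and make are on the PATH as-is, then ask R CMD config for the C compiler.

```r
# Sketch: look only at the PATH as it is. R CMD config needs sh and make,
# so check for those first; then ask it which C compiler R would use.
cc_from_path <- function() {
  if (!nzchar(Sys.which("sh")) || !nzchar(Sys.which("make"))) return(NULL)
  rbin <- file.path(R.home("bin"), "R")
  cc <- tryCatch(
    system2(rbin, c("CMD", "config", "CC"), stdout = TRUE, stderr = FALSE),
    warning = function(w) NULL, error = function(e) NULL
  )
  if (length(cc) && nzchar(cc[1])) cc[1] else NULL
}
```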

This has the side effect of detecting Rtools42 and later as "custom"; we can possibly do better there. But if not, that's not the end of the world, either.

@gaborcsardi
Member Author

From R 4.3.0, pkgbuild runs the compiler to test for build tools. This is much slower than finding Rtools, so I'll keep this issue open and will improve the behavior in the future. Or maybe not, since pak does not check for build tools on recent Windows any more anyway, and that's the main use case.
