Can't there be some sort of global binary cache server?

I like the fact that there are lots of interesting things in the AUR, but the problem is that the packages get built on my computer. For small packages it's fine, but sometimes it takes tens of minutes to build a single package, with lots of fan noise. And when I update the package later, the whole process repeats.

What I wonder is: can’t the binaries be cached and shared amongst all Manjaro users? I don’t know much, but if the binary results are the same for the same CPU architecture and 32/64-bit version, won’t a lot of users be building the same binaries? (I mean, most users are probably on x86, 64-bit.) Then couldn’t the first person who compiled the package for that setup automatically upload the binary to the AUR server, so all other subsequent users have the option to just download that binary instead of rebuilding it?

That’s a mirror.

That’ll cost extra bandwidth, which isn’t good if you’re using a metered connection.

And besides, “tens of minutes” aren’t that long. Not really.

I once, accidentally, tried to install Chromium from source. Now, I have a 6-core, 12-thread i7 with 16GB RAM. Not the most powerful computer, no, but certainly more powerful than a LOT of users’. After ±8 hours I stopped the compile.

No, I’m not complaining, I’m merely offering a comparison. “Tens of minutes” aren’t that long. There are so many applications on the Manjaro mirrors that there is hardly ever a need for the AUR.
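For illustration, it only takes a moment to check the official repos before reaching for the AUR. A rough sketch, assuming pamac is installed (the package names are just examples, and some-aur-only-package is a placeholder):

```bash
# Search the official Manjaro repos first
pacman -Ss chromium

# pamac searches the repos too; -a / --aur includes AUR results
pamac search chromium
pamac search -a chromium

# Only build from the AUR if nothing suitable shows up in the repos
pamac build some-aur-only-package
```

Chromium itself, for instance, ships prebuilt in the official repos, which is exactly why compiling it from the AUR is rarely necessary.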

And I’m pretty sure that if you’re going to use the AUR, you know this can be different.


Just to add to @Mirdarthos’s point: there are already thousands of precompiled packages available on the AUR → AUR (en) - Packages. For example, if you don’t want to compile visual-studio-code, you can instead install visual-studio-code-bin.

Also, most such packages that would take a long time to compile if installed through the AUR are already available pre-compiled in the Manjaro repos.
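As a rough sketch of the difference (assuming pamac is used as the AUR helper; yay or paru behave the same way, and the package names are the ones mentioned above):

```bash
# Builds the editor from source via its PKGBUILD; can take a very long time
pamac build visual-studio-code

# Repackages the prebuilt upstream binary instead; downloads, no compiling
pamac build visual-studio-code-bin
```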

Interesting. 16GB RAM could be a minor bottleneck, but I’d be more interested in how many make jobs (-j) you were using at the time.

Have you ever (purposeful, or just for giggles) built a kernel?

Way back when. When I first tried Gentoo. My PC at that time was a 700MHz AMD…

Edit:

The default. It was before I knew of/learnt about MAKEFLAGS. BUT according to KDE, all my cores and threads were above 90% usage.
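For anyone curious, the job count comes from MAKEFLAGS in /etc/makepkg.conf; a commonly suggested tweak (a sketch, adjust to taste) is:

```bash
# /etc/makepkg.conf (snippet)
# Let makepkg pass a parallel job count matching the number of CPU cores
MAKEFLAGS="-j$(nproc)"
```

If I remember right, the stock file ships with MAKEFLAGS commented out, so plain make-based builds mostly run a single job unless the build system parallelises on its own.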


I second this. :point_up_2:

Use the -bin versions of PKGBUILDs in the AUR whenever possible.

  • less to download
  • smaller cache footprint
  • fewer dependencies required
  • builds and installs much faster (no compiling from source; see the sketch after this list)
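A quick way to see that difference, assuming an AUR helper such as yay is installed (package names as in the earlier post):

```bash
# Source package: note the long "Make Deps" list (compilers, SDKs, ...)
yay -Si visual-studio-code

# -bin variant: typically far fewer make dependencies; it repackages an upstream binary
yay -Si visual-studio-code-bin
```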

That’s technically what the Community repository is. PKGBUILDs previously only available on the AUR become popular enough that they are “promoted” to the official Community repository, where they are adopted and updated by a proper Arch Linux package maintainer.

Only if we import an AUR package into our repos. We have several, actually.

The AUR isn’t a package repository, it’s a collection of build scripts.
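You can see that for yourself by skipping the helpers entirely: the AUR hands you a git repo containing a PKGBUILD, and makepkg turns that into a package. A minimal sketch (the package name is a placeholder):

```bash
# Fetch the build script for a package from the AUR
git clone https://aur.archlinux.org/some-package.git
cd some-package

# The PKGBUILD is just a shell script describing sources and build steps
less PKGBUILD

# Build it locally (-s pulls repo dependencies) and install the result (-i)
makepkg -si
```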


While the AUR can be very useful, it may also create other issues, because the packages a build script depends on are assumed to match those in the Arch Linux repos.

Furthermore, the scripts have been created with Arch Linux in mind, and you need to know that some scripts have dependencies which must be resolved manually.
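For instance, makepkg -s can only resolve dependencies that exist in the configured repos; anything a PKGBUILD lists that lives only in the AUR has to be built and installed first. A rough way to check before building (names are placeholders):

```bash
# Inside a cloned AUR package directory: see which dependencies it declares
grep -E '^\s*(depends|makedepends)' PKGBUILD

# makepkg -s only resolves these against the official repos; a dependency
# that exists only in the AUR has to be built and installed first, e.g.:
git clone https://aur.archlinux.org/some-aur-dependency.git
(cd some-aur-dependency && makepkg -si)
```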


Adding to @ishaan2479’s post, you could use an Unofficial user repository (ArchWiki) for precompiled packages. Chaotic-AUR* is not part of that list, as it is connected to Garuda Linux, but I’m using the precompiled fsearch-git, qbittorrent-qt5, jamesdsp-git and other packages from that repo.

* Be careful with this repo; it was removed from that list for the following reason: “Specific to a different distribution (Garuda Linux), replaces files in core Arch packages such as filesystem.” (link). This happens with certain packages (see reddit link), not the ones I use.
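For completeness, an unofficial repo from that ArchWiki list is enabled by adding a section to /etc/pacman.conf; the repo name, Server line and signature settings below are placeholders, since each repo’s own page documents the real values:

```ini
# /etc/pacman.conf (snippet); placeholder values, check the repo's instructions
[some-unofficial-repo]
SigLevel = Required DatabaseOptional
Server = https://example.com/$repo/$arch
```

Most of these repos also ask you to import and locally sign their key first (pacman-key --recv-keys, then pacman-key --lsign-key) before running pacman -Syu.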

There are also Flatpak and Snap.
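Both provide prebuilt applications without touching the AUR at all. A small sketch, assuming flatpak and snapd are installed and set up (the application names are just examples):

```bash
# Flatpak: install a prebuilt app from Flathub
flatpak install flathub org.chromium.Chromium

# Snap: install from the Snap store
snap install chromium
```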