UEFI doesn't recognize the bootloader after disconnecting and reconnecting the SSD

I have Manjaro installed on an SSD, and for gaming (when it's not possible on Linux) I have Windows on an HDD.

Today I wanted to install Windows 8 (7 hasn't aged well, in my opinion), so I disconnected the SSD so that the Windows installer wouldn't mess with it.

The USB was badly written (weird, since before formatting the Win7 partition I had used the official media creation tool to create the Win8 installer), so I had to connect the SSD again. To my surprise, the boot option wouldn't boot; I kept choosing it but it just didn't boot. I tried resetting the UEFI settings, but nothing. After messing with the settings, I ended up disabling Fast Boot, CSM and Secure Boot completely, and also disconnected all the drives and connected them again. Now there weren't any boot options, just Windows Boot Manager (although there's no Windows partition anywhere; I think it's a hardcoded option that points to a detected one).

I decided to boot a live Manjaro to check and fix GRUB. I don't know how to tell whether GRUB is OK, but I reinstalled it, and then it was recognized as a boot option and I was able to boot Manjaro. Until I disconnected the drive and connected it again. Is this the normal behavior? Disconnect a disk, and the next time the bootloader is not recognized by UEFI?
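
For reference, what I did from the live session was roughly this (a sketch from memory; it assumes the ESP gets mounted at /boot/efi inside the chroot, and uses the manjaro-chroot helper that ships on the live ISO):

sudo manjaro-chroot -a    # detect the installed system and chroot into it
grub-install --target=x86_64-efi --efi-directory=/boot/efi --bootloader-id=manjaro --recheck
update-grub
exit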

Another weird thing is that I can't launch the EFI shell. That's probably because it's not detecting the filesystem (SSD), but the error message says that Secure Boot is enabled… I can clearly see in the Secure Boot menu that its state is disabled, and so is the platform key.

Something to do with UEFI boot-configuration changes/resets during a (cold) double boot?

TL;DR: In the boot order/override menu, the only option I can see is a hardcoded one (with the SSD connected and detected in the SATA settings). UEFI stops detecting the bootloader on the SSD just from disconnecting it and connecting it again…

EDIT: In case it's worth mentioning, my case is kind of special (Aerocool Strike-X Air): the drive bays are removable and the drives are not directly connected to the mobo; instead, they're connected (directly, without cables) to a PCB, and that PCB is what's connected to the mobo with SATA cables. Additionally, I don't turn off the PSU to connect/disconnect the drives. Could any of this be the cause of the problem?

Ok, not on topic, I realize.

But you are on a fool's errand here. Support for Windows 8 has been dropped. It's dead, Jim! Everybody in the known universe has run away from Windows 8 like their hair was on fire.

Your only upgrade is to Windows 10. (And I don't recommend that either, but that's another story.)

Win 7 is probably the best version of Windows ever released. Your best bet is to save your data, nuke the machine, reformat, and reinstall Windows 7. You will be amazed how much better it works.

I see you already formatted. So there's your answer. You already nuked it from orbit, and you've got nothing now. Start over. You've got a blank slate. Take advantage of it.

And always de-power a drive properly before yanking it out. Surely the Aerocool thingie provides some method for that: a key, a switch or something.

Look here and here.


sudo grub-install --target=x86_64-efi --efi-directory=/boot/efi --bootloader-id=manjaro --recheck --debug --removable

Make sure “--removable” is in the above command.
Note however, after pulling out the external drive, you must boot Manjaro using the boot-select key (F8 ~ F12) when you reconnect it.
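
You can also verify the flag did its job: '--removable' puts a copy of grub at the fallback path that uefi firmware probes even without an nvram entry. From the installed system (assuming the ESP is mounted at /boot/efi):

ls /boot/efi/EFI/BOOT/
# should list BOOTX64.EFI (casing may vary)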

@jsamyth

You are mistaken; Windows 8 (updated, of course) EOL is in 2023: mainstream support until 2018 and extended support until 2023. If you want to get technical, then yes, Windows 8 (not updated) is no longer supported, and neither is Windows 8.1, leaving the latest upgrade, Windows 8.1 Update 1 (the “official name”), as the supported version. But that's kind of obvious, unless your computer doesn't support 8.1.

And again :grin: Windows XP is still the best version ever released, but it's too old, which leaves Windows 7 as the recommended option. But then, Windows 8 is better (in my opinion) thanks to its performance improvements (RAM management, etc.), and a better choice overall. Except for the ugly Start menu and the missing Aero, the desktop UI is actually a practical improvement and not that bad; the Task Manager is one of the reasons (after the performance improvements) I want to change.

What do you mean by de-powering the drives? Do you mean turning off the PSU? Waiting a few seconds before pulling them? The bays are just mechanical; there's nothing that prevents removing them with the computer on, no thingie or lock, nothing (image).

@gohlip

I don't know if that's my case; I've never had any problems with Windows messing with other bootloaders (except the first time I started dual booting). I've been dual booting without a problem; the problem is that after removing the drive, UEFI stops recognizing the bootloader (after connecting it again, of course).

The first link doesn't apply to my case because I formatted the Windows partition (HDD) before disconnecting the SSD, and I connected the SSD again after realizing Windows 8 hadn't been written to the USB drive.

So, the --removable flag tells me that it is normal for GRUB to stop working if the drive is disconnected? That doesn't make any sense to me; why would the devs code it like that? Well, it could make sense in this case: since I disconnect it while the PSU is still on, the mobo has power and can detect the removal.

I don't know if it's a good idea to use that flag on an internal SSD. What I'm going to do is install Windows and then reinstall GRUB again.

Anyway, I really want to know why UEFI stops recognizing it (or maybe it's GRUB breaking?). I'm going to run some tests:

  1. Switch off the PSU, switch it back on and turn on the computer. Check if everything’s ok.
  2. Switch off the PSU, disconnect the SSD, switch the PSU back on and turn on the computer. Turn off the computer, switch off the PSU, connect the SSD, switch the PSU back on and turn on the computer. Check if everything’s ok.

I'll repeat these tests with the HDD connected and then disconnected; this way I can check whether the Windows bootloader has any effect. I won't boot Windows in either case.

Very correct. It is not needed, as an internal device is always connected.
But in your special case, with your removable drive bays and POST, it may act as though the drives were disconnected and reconnected. If that is not the case and all drives are always connected at all times, then it does not apply to you. But as you've seen in the links (and in my case, where I have external drives), I have to add the '--removable' flag. You can easily try it out with the '--removable' flag and redo it without if it does not apply.

Just something for you to consider, if your system is doing it that way. You know your system best, and if you are sure it doesn't apply, then fine. Let us know if you finally find out why your system is rebooting after start-up.

PS: have you changed the CMOS battery?

My computer case may be special, but this scenario or setup is no different in any way from your computer or anyone else's:

  • The removable drive bays are just that: bays to put the drives in. The drives are still connected to SATA ports, so they act as normal internal drives, and they're never removed (except for reinstalling Windows, or some cleaning).

  • Power-on self-test is a base feature that every mobo has; it is in no way something special.

I hope it's clear now. Sorry, my English may not be as good as I think it is :stuck_out_tongue_closed_eyes:

Definitely.

I'm thinking that maybe the firmware was infected with malware, so I re-flashed it with the ASUS flash tool for Windows. Do you know if doing that removes malware? Unless it's stored in some other memory, if it was indeed the firmware, then replacing it should have worked. The settings weren't reset. I have yet to see if there are any more reboots after start-up since the re-flash.

I'm too lazy for that :sweat_smile: I'm actually waiting to see if the problem's elsewhere (malware), to find a solution, or maybe even waiting for it to fix itself :grin:

My problem is that I think my English is better than it is.
Same thing? See, told you so. :grin:

firmware …I re-flashed…Do you know if doing that removes malware?

It's rare that firmware gets infected, but reflashing should remove malware.

Back to your OP.

disconnected all the drives, and connected them again.
Now there weren’t any boot options, just Windows Boot Manager

I decided to boot a live Manjaro to check and fix GRUB. I don't know how to tell whether GRUB is OK, but I reinstalled it, and then it was recognized as a boot option and I was able to boot Manjaro. Until I disconnected the drive and connected it again. Is this the normal behavior? Disconnect a disk, and the next time the bootloader is not recognized by UEFI?

When we disconnect and reconnect drives without the '--removable' flag, that is 'normal behaviour'. But we can boot up (like you did) into the live CD and fix it, or use this or this to get back your boot. I think that's easier (personal bias possible :slightly_smiling_face:).

(although there's no Windows partition anywhere; I think it's a hardcoded option that points to a detected one).

If there is no more Windows and we want to get rid of the Windows boot-order entry (it keeps reappearing, as do those of other uninstalled Linux OSes), we have to remove its directory in /boot/efi. It is not so much a 'hardcoded' option; rather, the directory in /boot/efi/EFI remains.

sudo rm -Rf /boot/efi/EFI/Microsoft
sudo rm -Rf /boot/efi/EFI/<Other Linux>

“efibootmgr -b xxxx -B” will not remove it, and the system will regenerate it even though the OS has been removed.
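
To see what is there in the first place, the entry numbers come from 'efibootmgr -v' ('0003' below is just an example number, use yours):

efibootmgr -v                 # list the Boot#### entries and what they point to
sudo efibootmgr -b 0003 -B    # delete entry Boot0003 from nvram

Remove the directory first (as above); otherwise, as said, the system regenerates the entry.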

Cheers.

Ok, so that explains why UEFI does not recognize the bootloader.

Curious: before booting Manjaro (after 1 day in Windows) and reading your last post, I connected the USB with Manjaro to reinstall GRUB, and I noticed a new option that says “Detect EFI bootloaders”, and it let me boot Manjaro. So were you the one who added that option? Or was it added thanks to you?

How come GRUB on a USB drive can detect existing GRUB installations but UEFI can't? So GRUB is not broken then; is it a UEFI bug?

I didn't need to remove those entries manually; they were gone the other day after I finished messing with it (resetting, disconnecting, etc.).

So I just have to run update-grub or reinstall GRUB again so that UEFI detects it again?

No. It has nothing to do with me (this feature has been there all along). In fact, I copied the 'stanza' because it looks interesting, though I have my own methods (as per the links).

In your case, and in so many others where the bootloader (or grub.cfg) was never broken in the first place, a simple “sudo grub-install” when booted into the installed OS is all it takes.
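
For instance (a sketch, assuming a default install with the ESP mounted at /boot/efi):

sudo grub-install --target=x86_64-efi --efi-directory=/boot/efi --bootloader-id=manjaro --recheck
sudo update-grub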

This Question last…

Last because it is difficult to answer accurately.
Our install media (17.0.2 and later) uses GRUB, and any GRUB can boot any OS, whether that OS has a GRUB bootloader or no bootloader at all.
UEFI (I take it you mean the BIOS itself) stores boot-order entries. If a disk is removed and the system is booted up, the EFI entries for that removed disk are also removed from the BIOS. (We can check with 'efibootmgr' when booted up without the removed disk.) When the disk is reinserted, they remain lost. [A removed disk installed with the '--removable' flag, when reinserted and selected for boot, makes the BIOS recheck for bootable content, something a non-'--removable' disk does not trigger.]
A BIOS-legacy disk, when reinserted, has the 'stage 1' bootable 'content' in the MBR, so the BIOS can then proceed to look for the 'stage 2' content in the OS partition. So in this respect, a BIOS-legacy external is like a '--removable' UEFI external device.
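
You can see this for yourself: the entries live in the firmware's NVRAM, not on the disk, so with the disk unplugged the entry is simply gone (assuming a uefi boot, where the kernel exposes the variables):

efibootmgr                                    # BootOrder plus the Boot#### entries
ls /sys/firmware/efi/efivars/ | grep '^Boot'  # the boot-related nvram variables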

Still not technically accurate, but I hope you get the picture.

Cheers.

Yes, but technically BIOS and UEFI are different things; both are firmware (old and new, respectively). Anyway, let's assume that BIOS means UEFI; I'm also more comfortable with the term:

char *bios = "uefi"; :grin:

(Sorry, I can't help it; I feel the need to explain the obvious.)

Not fully. What I understand is that on a disk partition there's a bootloader (GRUB), and the BIOS detects it and adds it to the boot order. So if the bootloader is on the SSD and it's actually working, why is the BIOS not detecting it?

You mean “if a disk is removed and the computer is booted up”?
What do you mean by “the entries in that removed disk”? How are the entries removed from there when the disk is disconnected, if the disk is off? Or why are they removed at all? Nothing should be removed from the disk… In fact, nothing has been removed.
I can understand the BIOS removing them (from its memory); that's actually logical: the disk is no longer present, so the BIOS doesn't show them anymore. But when the disk is connected again, the BIOS should detect the bootloader and show the option(s)… booting automatically into the first one in the boot order.

Am I talking nonsense? I feel like my logic is messed up.

No, not at all. I too was ‘shocked’ when it happened to me.

Correct, not from the disk,
but removed from the (UEFI) BIOS.

Precisely.

Unfortunately, that's not the case, and I too was disappointed. That's when I discovered “--removable”. And that itself carries a 'penalty': if I use “--removable” on an internal, always-connected disk, it won't be listed in the efibootmgr boot order, and I have to select the boot entry with the 'boot-setup' key (F8 ~ F12), just like with an external drive.
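
To make the difference concrete, this is roughly where each mode puts the loader (assuming --efi-directory=/boot/efi and --bootloader-id=manjaro):

ls -R /boot/efi/EFI
# without --removable: EFI/manjaro/grubx64.efi plus an nvram boot entry
# with --removable:    EFI/BOOT/BOOTX64.EFI and no nvram entry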

Your questions are totally understandable, and I guess there are some constraints in UEFI that made these limitations inevitable, but hopefully they're surmountable in the future.

Cheers.

[Addendum]
Here's how I circumvent the limitations. I have been using my own GRUB (GRUB 2), and not any OS's bootloader, to boot every OS for a long time now. It is more difficult to achieve in UEFI than in BIOS-legacy, though it is much more troublesome to do in GRUB-legacy (GRUB 1) than in GRUB 2. Of course, put it on the internal drive. :slightly_smiling_face: It will boot anything, on internal or external drives, inserted or reinserted. Thought you might be interested.

Good night.


Thank you so much; I'll take a look at your GRUB 2 post when I have a moment.

I fixed it for now by running grub-install and update-grub.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.