- boot something like Hiren's BootCD via USB
- mount the device Windows MBR maps as C:
- extract the rarball onto the mount point
- reboot
- learn not to dumbass, dumbass
You made up all of those words, except dumbass I suppose. I’ve heard that one before.
Does winrar store ntfs attributes and security settings? If not, then simply uncompressing the files to their original location may not result in a working OS, especially if the registry wasn’t backed up because winrar couldn’t access it
I think what’s funniest to me is he was very close to recreating a real windows feature. You actually can “compress files to save space” under options, and apply it to all child files and folders. This is not the way to do it though lmao
Of course any disk space you save by doing that is exchanged for performance and time.
ok but that negates any storage saved tho
I know I’m stating the obvious but I need to say it. It’s so funny to me that even if he succeeded in what he wants to do (unpack from the BIOS?), the space would be taken up again every time the computer booted, so effectively he’d never be able to use the freed space.
no clue why you’d be that person. but beyond that, they could mean extract into RAM (which is equally stupid and most likely not viable, but eh)
Didn’t realise there’s a windows version of sudo rm -rf /
its called windows update
An OS in the 2020s should not use 40+ GB. Period.
true
Making a rar archive of your windows folder doesn’t leave your OS unbootable…
These two would beg to differ:
C:\Windows\EFI\bootmgr.efi
C:\Windows\System32\ntoskrnl.exe
The first is generally copied to the EFI partition, but the second isn’t. The second one contains all the Windows Executive services, so good luck booting without it. The Windows folder also contains explorer, which is needed if you want a graphical desktop, and most drivers.
Who said anything about being without them? It was only stated the windows folder was added to a RAR archive. Compressing files into an archive doesn’t remove them, it just creates compressed copies.
and he had space on a “completely full” disk to save an archive of a 40GB windows folder.
Wait… Do you guys think someone would really do that? Go onto the internet and tell lies?
I’ll assume it’s a typo. for now.
Rookie mistake, they should have compressed the rar file with 7zip to save even more space.
I compress my files with ROT13
I compress everything with md5
I’m still amazed at how much 7z can compress. The other day I extracted a 1.5 GB file, and it ended up being 4.7 GB.
That sounds like it was a DVD image that was mostly empty space, so any compression tool would have been able to save space. But yes, 7z is still impressive.
wrong, should’ve used 7z or zip under LZMA2 for best compression in this case
This was something I actually used to do with my one gigabyte hard drive in my Windows 95 machine. Though turning on drive compression through Windows would cause the machine to be soooo fucking sloooooow.
i think u mean the built-in compression. this mf is using external software to create a windows.rar outside of the boot process and deleting the original, which can never work
question, is there a way to use compression on the whole system, so files are decompressed on the fly?
Exchange overall storage space for speed?
there are some filesystems that do exactly this; I use btrfs with compression enabled, for example.
yeah but it doesn’t make much sense
what’s actually taking up disk space are images and video, and those are already compressed. you cannot compress already compressed data again, it just won’t reduce the filesize. so there’s no point compressing everything, because where it matters, it doesn’t do anything, and where it doesn’t matter, it makes things slow and adds complexity, which is an additional failure point.
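The “you can’t compress already-compressed data” point is easy to demonstrate; a quick Python sketch, using random bytes as a stand-in for a JPEG/MP4 payload (compressed data is statistically close to random):

```python
import os
import zlib

# Repetitive text compresses dramatically...
text = b"the same line over and over\n" * 1000
print(len(text), len(zlib.compress(text)))  # compressed size is a tiny fraction

# ...but random bytes (a stand-in for already-compressed media)
# don't shrink at all; the output is even slightly larger
# because of the compressor's own framing overhead.
random_ish = os.urandom(100_000)
print(len(random_ish), len(zlib.compress(random_ish)))
```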
Some filesystems (like btrfs and ZFS) support compression on the filesystem level, where each block is compressed with some algorithm automatically, completely transparently to applications.
Most modern CPUs are fast enough at the light compression levels used that it’s usually also faster: you read less data, and the read + decompress time is often lower than the time to read the uncompressed data. Of course that depends on the data, but overall it’s often faster (though usually not by a significant amount) for most average uses.
In short, yes. Windows has a checkbox in the settings somewhere (I think the partition manager? I can’t remember now) to enable compression on a given partition, so you can in effect enable filesystem compression on the C: drive. Through the command line you can also compress individual directories with different compression algorithms, and I had incredible luck compressing game files with LZX: some games compressed down by 3 or 4 times (these were notably games with 100s of gigabytes of user-generated assets; more normal games only saw around a 20-40% reduction in storage space, which isn’t bad at all)
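For reference, the command-line route is `compact.exe`; a rough sketch from a cmd prompt (the game path here is a placeholder):

```shell
:: Show the current compression state of a folder and its contents
compact /s:"C:\Games\SomeGame"

:: Compress everything in it (and subfolders) with LZX,
:: the strongest algorithm compact supports
compact /c /s:"C:\Games\SomeGame" /exe:lzx

:: Undo it if load times suffer
compact /u /s:"C:\Games\SomeGame" /exe
```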
Outside of Windows there are popular filesystems like zfs and btrfs which support filesystem compression and generally encourage it by default, because decompressing after reading from disk is almost always faster than reading the uncompressed data from the disk directly
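On Linux this is typically just a mount option; for btrfs, for example, something like this fstab line (the UUID is a placeholder for your own volume):

```shell
# /etc/fstab — mount a btrfs volume with transparent zstd compression
UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx  /home  btrfs  compress=zstd:3  0  0
```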
It’s existed since DOS. It doesn’t work all that well. But it does work if you’re desperate for space.
https://en.wikipedia.org/wiki/DriveSpace
https://www.tomshardware.com/reviews/ssd-ntfs-compression,3073-11.html
I know the opposite can be done because I did it just recently.
I have a nearly 10-year-old setup from when it still made sense to pair a 200gb SSD with a 2tb HDD for games. This hard drive is absolutely struggling with these massive games like Baldur’s Gate 3 and Cyberpunk (and Baldur’s Gate 3 has the annoying habit of not waiting for assets to finish loading before playing a cutscene).
I used this thing called bcache to take a 100gb partition of my SSD and automatically cache the most frequented files from the HDD. Even though Baldur’s Gate 3 is 120gb (which I don’t think it needs to be, I think it’s poorly optimized), it was still enough to mostly get rid of any loading issues.
To make this relevant to your question, you could get a massive cheap but slow hard drive or even an external drive and use something like bcache to get the performance of your internal SSD.
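A minimal sketch of the bcache setup described above, assuming `bcache-tools` is installed; the device names are placeholders for your own HDD and SSD partitions, and note that `make-bcache` wipes whatever it formats:

```shell
# Format the slow HDD partition as the backing device and the
# SSD partition as the cache, attaching them in one step
make-bcache -B /dev/sdb1 -C /dev/sda3

# A /dev/bcache0 device appears; put a filesystem on it and mount it
mkfs.ext4 /dev/bcache0
mount /dev/bcache0 /mnt/games
```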