(RETURN TO MAIN PAGE)

There’s a lot of technical talk in this document. If this scares you, click the link above and go hug a cat or something.

RAM Drives

Okay, so your old SATA hard drives are big and slow. It takes a long time to write data to them and read it back.

Then there are these solid state drives (SSDs). Lots of chips on those. They’re fast. Newer than SATA SSDs are NVMe drives, which skip the SATA interface entirely. And there are even faster storage options than those.

But you know what’s really fast? Your system RAM.

If only there was a way to store your Second Life cache on that.

How big should it be?

(Phrasing!)

According to Firestorm: “Set this as high as possible, based on the free space on your hard drive.”

Well, that’s… um… er… not quite helpful. I mean, the dialog boxes let you set up to 20GB of cache in Firestorm, which means you’d need 20GB of drive space for it. But unless your system has a huge amount of memory and you rarely need it for anything but Second Life, that’s really overkill.

But what I do know is that a RAMdrive will make it faster!

Now, if you’re making a RAM drive for your Windows and Chrome temp files, the answer is “Big enough to let drivers and program installers run.” nVidia’s installer needs way more than 1GB of temp space… well… you’ll see in a bit why this is important.

Set up a RAM drive

A RAM drive allows you to use system memory like a hard drive.

It’s extremely fast. But it vanishes when you restart your system. Thankfully, you can save it or load it at system restart.

The one I use is SoftPerfect RAM Disk. But there are others out there, so check their reputations and support pages.

Because I have so much memory on my system (at last check, 64GB. It used to have 128GB, but I decided I didn’t need all of that, and spacing out the memory sticks allows much better cooling. Still, that’s a lot. Probably too much.), I created a 10GB RAM drive for my Second Life caches and an 8GB RAM drive for temporarily saving photos and videos. Because Second Life can save photos and videos faster, my system doesn’t stall as long between snapshots and captures.

  1. In the menu bar, click Image.
  2. Then click Create Image.
  3. Set a directory location (Ensure it won’t sync with OneDrive or other backup systems which will clog your network traffic).
  4. Name it something sensible.
  5. Give it a size in MB.
  6. Change type to exFAT (exFAT is faster than NTFS and FAT for this purpose, and Richard Simmons would agree).
  7. Click OK.
  8. Click the plus button.
  9. Select the drive image’s location and name.
  10. Select a drive letter.
  11. Select the Save drive contents to image checkbox.
  12. Click OK.
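
Step 5 asks for the size in MB, not GB. A quick sketch of the conversion, assuming the dialog means binary megabytes (1GB = 1024MB):

```python
# Convert a gigabyte figure to the MB value the size field wants,
# assuming binary units (1 GB = 1024 MB).
def gb_to_mb(gb):
    return gb * 1024

print(gb_to_mb(10))  # 10 GB Second Life cache drive -> 10240
print(gb_to_mb(8))   # 8 GB photo/video drive -> 8192
```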

The R drive stores photos, the S drive is for Second Life cache, and the T drive is for Windows and Chrome Web Browser temp files. The T drive does not save to an image file because that crap needs to die every time I reboot.

Let’s set the Firestorm cache location…

  1. Open Preferences (Control-P).
  2. Click the Network & Files tab, then click the Directories tab.
  3. For Cache location, click Set.
  4. Select the drive you set for the cache and click OK.
  5. Click OK.
  6. Restart your viewer.

For what each of the caches does, read Firestorm’s explanations:

Texture Cache Size: Determines the maximum texture cache size. Set this as high as possible, based on the free space on your hard drive.

Asset Cache Size: Determines the maximum cache size for the rest of the assets. (Assets include animations, sounds, and mesh data. Textures are also assets but have their own slider control.) Set this as high as possible, based on the free space on your hard drive.

Second Life tends to use a lot of memory for textures, but then we use memory for meshes, mesh clothes, etc. Animations don’t use much space, and we don’t really need to store them between sessions… we’ll cache them right before each act. Sounds are not a priority, since we use a music stream. So textures should have priority, I think… I went with a 70/30 mix, but could go with more on textures and less on other assets if I had to choose.
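
The 70/30 split works out like this on my 10GB drive (the exact numbers are just a sketch of my setup, not a recommendation):

```python
# Splitting a 10 GB RAM drive 70/30 between the texture cache
# and the asset cache, in the MB units the sliders use.
DRIVE_MB = 10 * 1024                 # 10 GB RAM drive, in MB

texture_mb = int(DRIVE_MB * 0.70)    # Texture Cache Size slider
asset_mb = DRIVE_MB - texture_mb     # Asset Cache Size slider

print(texture_mb, asset_mb)          # -> 7168 3072
```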

Of course there’s a catch…

Because your system is loading those two persistent cache files into memory when you boot and saving them when you shut down (or reset), you will experience a longer boot up and shut down period.

However, SSDs tend to be faster at reads than writes, so the delay will be shorter at boot than at shutdown (or reset). Which doesn’t matter if you’re shutting down for the night, but it may add a few seconds to your reboot times.

Doing reads and writes from the RAM will save some wear-and-tear on the SSD, but then reading and writing the data for the RAMdrive will add some wear-and-tear for that persistent cache file.

Once again, you could go crazy with a 20GB cache and 10GB for each, but if you’re using persistent storage, that’s up to 20GB to save with every shutdown and up to 20GB to load with every startup. So in the end, that will punish your drive with 40GB each cycle, 280GB of transfer in a week (assuming one cycle a day), and then after a year… well… welcome to SSD Meltdown Town?
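
Here’s the back-of-the-envelope math on that worst case. Note that only the writes count against an SSD’s endurance (TBW) rating; the 600 TBW figure below is an assumption based on a typical 512GB NVMe drive’s spec sheet, so check your own drive’s documentation:

```python
# Worst case from above: a 20 GB persistent cache, saved at shutdown
# and loaded at startup, one cycle per day.
CACHE_GB = 20
per_cycle_gb = CACHE_GB * 2               # 20 GB written + 20 GB read
weekly_gb = per_cycle_gb * 7              # -> 280 GB of transfer per week

# Only writes wear the flash. Compare yearly writes to an assumed
# endurance rating (TBW = terabytes written over the drive's life).
writes_per_year_tb = CACHE_GB * 365 / 1000  # -> 7.3 TB written per year
TBW_RATING = 600                            # assumed rating, 512 GB NVMe class
years_to_exhaust = TBW_RATING / writes_per_year_tb

print(weekly_gb)                      # -> 280
print(round(writes_per_year_tb, 1))   # -> 7.3
print(round(years_to_exhaust))        # -> 82
```

So the cache saves alone chew through the endurance budget pretty slowly; it’s everything else your system writes on top of that which adds up.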

Your drive’s documentation should tell you what the drive’s lifespan and tolerance is, and if it comes with an accessory/app, it should give you ample warning of failures. (Mine is a Samsung 970 Pro 512GB that has an app which says my drive is still good as of July 2022, but the reads and writes are climbing over time. I’ll switch to a 980 Pro when it gives me warnings.)

That’s why I keep copies of applications, data, and other things, as well as make backups. I also have a disaster recovery plan which I test once a year with a full rebuild.

Cooling

Everything in your computer generates heat when it runs. It generates even more heat when it runs more.

Be sure that your memory has adequate airflow. If your sticks have heat spreaders on them, make sure they’re unobstructed. And if you have liquid cooling on your memory, well, you’re just rolling in cash, aren’t you?

The LED lighting on memory is fancy and silly. It shouldn’t add much heat, but it’s also just another thing that can fail on your system, so why add more points of failure?

Finally, having more memory is often a good thing, but remember that it also draws more power, and if you pack every slot, you’re reducing airflow and concentrating a lot of heat. Sometimes it’s better to get a motherboard with twice the slots you need so you can space out the memory sticks. And when it’s time to upgrade, don’t just add more sticks of what you have; replace them with sticks of double the capacity to avoid airflow issues.

Windows Temp Files

If you create a RAM drive for Windows and Chrome temp files, make sure it’s big enough.

The nVidia driver installer needs way more than 1GB of temp space to run, so yeah, my 1GB RAM drive for Temp failed miserably the next time I installed the driver, and BOOM.

See? I do dumb stuff, too. But I don’t make mistakes during shows, dammit.

Is it reliable?

It’s a computer. Things break.

So, when I am done capturing photos and video, I move the files to my old hard drive (D:) for long-term storage. And I make backups once a week to an external drive array.

Also, set a reminder to check the SoftPerfect RAM Disk page once a month or so, or have Norton or whatever software checker you use look for updates. They release bug fixes and improvements on an irregular basis. (I have a Siri reminder to check all my software once a week.)

Memory Speed

(This is somewhat related to RAMDrives.)

Yes, your system memory is faster than your SATA, M.2, and other solid state drives. But is it as fast as it should be?

A lot of memory manufacturers advertise speeds like 3200, 3600, and 4000, and the memory has been tested and rated to handle that data rate reliably. But when you put it in a motherboard that’s supposed to be able to handle that speed, the board actually runs the memory at a slower default speed, like 2400, until you tell it otherwise.
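
How much does that default cost you? A back-of-the-envelope sketch, assuming dual-channel DDR4 (8 bytes per transfer per channel):

```python
# Theoretical peak bandwidth for DDR memory:
# (megatransfers/s) x (8 bytes per transfer) x (channels).
def peak_gb_s(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000

print(peak_gb_s(3200))  # -> 51.2 GB/s at the rated XMP speed
print(peak_gb_s(2400))  # -> 38.4 GB/s at the slower default speed
```

Real-world throughput will be lower than either figure, but the gap between the two is what you paid for and aren’t getting.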

XMP (Extreme Memory Profile) was created by Intel and has since been adopted by a lot of manufacturers. It allows you to set a profile in your motherboard’s BIOS to run your memory at its rated speed. Some call it memory overclocking, but it’s not really overclocking if it’s running at the speed on the label, is it?

When you go into your motherboard’s BIOS setup (usually F2 on boot) or look at a diagnostic within Windows, you’ll see the speed of your memory. Compare it to what’s on the package. If it’s less than it should be, look for the XMP setting in the BIOS and turn it on.

Is it safe? Well, it’s supposed to be safe, but with everything there is risk. If your memory is old or cheap or there’s something wrong with it or you’re already running really hot, or if the CPU has a much slower memory cache and clock speed, yeah, something might glitch and lock up and slow down. But for the most part, things will be okay, and you’ll get the full speed of the memory you paid for.

The newer the motherboard (with an updated BIOS) and the newer the memory, the newer the generation of XMP they will use, so the more reliable it will be.

On top of that, there’s PCIe Gen 3, PCIe Gen 4, PCIe Gen 5… and all kinds of other detail stuff. I mean, as of August 2022 I’m only running PCIe Gen 3 and an Intel 10940X, overclocked and watercooled, which makes for a slower transfer rate across the backplane, and the video card doesn’t even run at full speed because it’s Gen 4, so… yeah, whatever you do, there’s always some bottleneck that some store will try to sell you an upgrade for. It never ends. Assume your hotrod system today is tomorrow’s slowpoke, and one of your friends will take your king-of-the-hill crown eventually. Just be happy until it all melts down.

And I suppose it goes without saying that keeping your drivers updated to stable versions/releases is usually a good thing, too (Good developers and QA testers tend to introduce more fixes and patches and features than bugs).

(RETURN TO MAIN PAGE)