
  • glibc’s malloc creates per-thread memory arenas, and the maximum number of arenas scales with the number of CPU cores you have. The JVM might spawn a shitload of threads, so that can increase the memory usage outside of the JVM’s heap considerably. You could try running the JVM with tcmalloc preloaded (which replaces the malloc calls of the spawned process). Different JVM builds also bundle different memory allocators; I think Zulu could improve the situation out of the box, and tcmalloc might still help on top of that.
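
    A minimal sketch of the two knobs mentioned above. The jar name and the tcmalloc library path are placeholders/assumptions; the path varies by distro (tcmalloc ships with gperftools):

    ```shell
    # Option 1: cap the number of glibc malloc arenas for the JVM process.
    # glibc honors this environment variable at startup.
    export MALLOC_ARENA_MAX=2
    java -jar app.jar                  # "app.jar" is a placeholder

    # Option 2: preload tcmalloc so all malloc/free calls in the process go
    # through it instead of glibc's allocator (library path is an assumption).
    LD_PRELOAD=/usr/lib/libtcmalloc_minimal.so java -jar app.jar
    ```

    Option 1 only limits arena growth; option 2 swaps the allocator entirely, which is what replacing malloc for the spawned process means in practice.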


  • I ran Arch on a convertible laptop around 2006–2010. I took most notes in OpenOffice Writer, with hotkeys to quickly insert formulas. Drawings were done with the pen. Homework (where speed didn’t matter as much, but where I wanted high quality) was done in ConTeXt.

    Programming was done in FreePascal using Lazarus IDE or Java using Netbeans IDE, depending on the course and my personal preference.

    I don’t think anyone complained. Quite the contrary: one professor even gifted me a book as thanks for the high-quality typesetting in my homework, since most students didn’t give a shit and had no fucking clue how to really use their beloved MS Word.







  • I am not a big fan of this, because you then rely on the scanner manufacturer to produce good quality results.

    I scan everything using VueScan, which has a special mode for text documents. A single page with OCR ends up being about 25 KB as a PDF. It removes folding edges, sharpens the letters, etc.

    If that software gets new features, my scanning experience improves automatically, even though I’ve been using the same scanner for 10 years now.

    If I relied on the scanner’s firmware instead, I would have stopped getting updates long ago, and I’d either have to live with the results or throw away the whole device.

    Just as people here recommend separating printing from scanning, I recommend separating the hardware from the software.