Who is planning on trying out Mint 18 Sarah? I'm thinking about installing from scratch over the Mint 17.3 on my SSD to get a fresh start. I initially installed Mint 17 and then upgraded to 17.2 and 17.3. I have a feeling Mint 18 is based on Ubuntu 16.04, but is it a Long Term Support release?
Well, we are given two choices! Wait a month and let Mint 17.3 upgrade itself to 18... or, what the hey, let's just start over and see how much I can remember! So how good is 18? Well, I like it just so very much!! ":O} We are short on backgrounds, lacking the previous Mint's background pics, but the new ones are nice!

Here's something new and possibly wonderful! Who can say what and how effective Intel Microcode might be... No really! Who can tell me what I've just gained in Mint 18 thanks to Intel and their micro-stuff?

Everything looks pretty good... Install was a BITCH!! I really don't know why! I DL'ed as usual and then checked that all was well with my DVD's MD5. Then I watched my machine screw up in just about every way possible! The disk would not take me to the desktop as per usual. Instead it kept dropping to a terminal start and not finding things, like Network Manager; there was a list of things not found, but as the night grew long my memory grew shorter. After several hours, despite the disk reporting itself to be well and good, I downloaded and burnt another copy... Don't know why, but this time things went much better. (Much more Minty this time!)

I'm about 85% installed now and adding my stuff as I realize it's missing... Doing Steam and POL now... A background to howl over!!

As there is so much to comment upon, I'll leave this here. However, I'd be glad to check on anything anyone might want to know more about. Other than the Intel Microcode, there isn't a ton that jumps out at you. Just the usual solid, stable Mint... so far! ":O} But as we all know, Mint don't do upgrades for nothing... So I'm open to change! ":O}
Awesome, Daniel! I'm glad you gave 18 a whirl... I have it on my flash drive and plan to upgrade soon. I have too many "save games" I need to back up in order for it to go happily. Not sure why the first DVD was giving you a headache? Maybe something got corrupted during the burn? I stick with my flash drives because I can use them over and over again.
I may be hitting you up for flash drive instructions soon. I still have a dozen or so DVDs I can burn, but as you say, USB thumb drives are reusable! ":O} Do not fear the Mint! She brings blessings as yet un-thought of!! ":O}
Actually it is super duper easy:

1. Download the .iso
2. Insert the flash drive
3. Right-click the .iso
4. Select "Make bootable USB Stick"
5. Select the flash drive on the right side
6. Begin

That's it. Mint will automatically format the flash drive, make it bootable, and install the live-disk Linux on it. Mint makes everything super easy! Just don't select the wrong drive (partition); otherwise you might format the wrong one.
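For the curious, here's a rough sketch of roughly what a tool like that does under the hood: a raw copy of the ISO onto the stick. This assumes the Mint ISO is a hybrid image, and /dev/sdX and the ISO filename below are placeholders, not real paths from the posts above.

```python
#!/usr/bin/env python3
# Minimal sketch: write an ISO image raw to a flash drive.
# WARNING: DEVICE is a placeholder -- writing to the wrong device wipes it.
# Must be run as root.
import os
import sys

ISO_PATH = "linuxmint-18-cinnamon-64bit.iso"  # placeholder filename
DEVICE = "/dev/sdX"                           # placeholder: your flash drive, NOT a partition

CHUNK = 4 * 1024 * 1024  # copy 4 MiB at a time


def write_image(iso_path, device):
    with open(iso_path, "rb") as src, open(device, "wb") as dst:
        while True:
            chunk = src.read(CHUNK)
            if not chunk:
                break
            dst.write(chunk)
        dst.flush()
        os.fsync(dst.fileno())  # make sure everything hits the stick before unplugging


if __name__ == "__main__":
    if os.geteuid() != 0:
        sys.exit("Run as root, and triple-check the device path first.")
    write_image(ISO_PATH, DEVICE)
```

The GUI tool is still the easier and safer route; the sketch is only meant to show there's no magic involved.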
Yeah! I love Mint so much that I'm using it at work as well. I hardly touch Windows unless I have to. Linux will always be a fun experience.
I'm giving Mint 18 a miss, and possibly Mint 19 as well, until either systemd grows up, or gets thrown out as a bad idea...
I recently picked up four 1-terabyte drives and installed them in four computers. Installed Mint 18 alongside Win8 (dual-boot) and all is going well so far. I'm going to install all of my Steam & PlayOnLinux LAN games on one machine, share the .PlayOnLinux folder, and then copy it to the other machines. This should save a lot of work because the virtual drives will stay intact, thanks to the exact same username on each machine, so I shouldn't have to re-install all the games on each one. I have experienced a few random crashes, but I'll ignore them for now.
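A rough sketch of that copy step, assuming rsync over SSH is available on both machines; "othermachine" is a placeholder hostname, not something from the post above. Because the username (and therefore the home path) is identical on every box, the copied wineprefixes should keep working.

```python
#!/usr/bin/env python3
# Sketch: mirror ~/.PlayOnLinux to another machine over SSH using rsync.
import os
import subprocess

SRC = os.path.expanduser("~/.PlayOnLinux/")   # trailing slash: copy contents
DEST = "othermachine:.PlayOnLinux/"           # placeholder host; path is relative to remote home

# -a preserves permissions/timestamps, -z compresses, --delete makes an exact mirror
subprocess.run(["rsync", "-az", "--delete", SRC, DEST], check=True)
```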
The Intel Microcode driver allows Intel to push updates to the CPU microcode out via the Linux kernel, rather than requiring you to update your BIOS (the more common and typical way of getting microcode updates). It SHOULD be safe to leave on.

A little background: way back in the stone age, CPUs were 'hard coded', meaning that the logic for each CPU instruction was actually etched into the silicon. This meant that making a change to an instruction (e.g. in the case of a bug that caused errors in certain floating-point math functions) would require changing the actual manufacturing process of the CPU, and all existing hardware would remain defective because there was no way to correct it.

In addition, CPU designs were getting ever more complex, to the point that it was becoming extremely difficult to model and simulate new designs accurately. This meant that it became ever more difficult to predict with any accuracy how a given CPU would perform under arbitrary conditions, as well as making it increasingly difficult to debug problems.

Finally, the rise of RISC (Reduced Instruction Set Computing) brought to the fore a school of design thought which suggested that, rather than ever more complex instructions (as embodied by CISC, or Complex Instruction Set Computing, used by Intel, Motorola, and most other CPU manufacturers of the time), the key to increased performance lay in executing much simpler instructions, and executing them much more rapidly.

All of these things combined to push microcode into mainstream CPU production. Microcode was actually in use for quite a long time (the venerable 6502, for example, used microcode) in the CPU industry, but generally in the form of either ROMs or OTPLAs (one-time programmable logic arrays). Unfortunately, these methods still meant that whatever microcode was programmed into the CPU at the factory was what it would have for the rest of its life; bugs still couldn't be fixed in the field. Eventually, a very small amount of onboard memory (typically SRAM) began to be included for the specific purpose of allowing updates to the microcode to be uploaded to the CPU to correct logic defects, and thus the microcode update was born.

So, what exactly is microcode? Well, put succinctly, it is the program the actual CPU hardware runs. "Wait," I hear you ask, "isn't the program what I load whenever I run my game?" Well, yes and no. Yes, your game is an application, written in some programming language and compiled to the standardized machine code the CPU architecture understands. The microcode, however, is what handles the decoding and execution of those machine code instructions. The microcode is targeted to the specific hardware of the CPU, and so is even MORE hardware-centric than the machine code that applications get compiled to (indeed, it is so tied to the hardware of the CPU that different revisions of what is otherwise the SAME CPU can run incompatible microcode).

So, your system is running application code, compiled to machine code, decoded by microcode, and in some cases, processed by NANOcode. All so that the CPUs we use can continue to evolve in complexity, capability, and performance, at the levels we expect, while still being designed by mere mortals.
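If you're curious which microcode revision your CPU is actually running after installing Mint 18, here's a minimal sketch that reads /proc/cpuinfo on an x86 Linux box; the "microcode" field it reports is the revision currently loaded, whether it came from the BIOS or from the kernel-side update.

```python
#!/usr/bin/env python3
# Report the microcode revision the kernel sees for each logical CPU,
# by parsing the "processor" and "microcode" fields of /proc/cpuinfo.


def microcode_revisions(path="/proc/cpuinfo"):
    revisions = {}
    cpu = None
    with open(path) as f:
        for line in f:
            if ":" not in line:
                continue
            key, value = (part.strip() for part in line.split(":", 1))
            if key == "processor":
                cpu = value
            elif key == "microcode" and cpu is not None:
                revisions[cpu] = value
    return revisions


if __name__ == "__main__":
    for cpu, rev in sorted(microcode_revisions().items(), key=lambda kv: int(kv[0])):
        print("CPU {}: microcode revision {}".format(cpu, rev))
```

Typically all cores report the same revision; a newer revision after enabling the microcode driver is the only visible sign that the update took effect.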
I'm not sure how we got onto the CPU subject, but I'm continuously fascinated by how a CPU works. I was told an introduction to processors can be a really cool class to take, especially if you want to get into programming. It seems like there would eventually be an easier way to process code instead of so many layers. You would think one method of compiling and decoding would be faster... Do ARM processors work in the same fashion as you have explained?
Well yes, I've come to the same conclusion, but that doesn't mean I have to use it in its present condition. I'd prefer to sit on the sidelines and watch to see if it proves itself better for my purposes or not.
Knowing HOW and WHY things work can help when it comes to debugging software. For writing, not so much. Sadly, most programmers seem to think that debugging is someone else's problem. That's the difference between a programmer or developer and an ENGINEER.

From a pure performance standpoint, it's true that you pay a performance penalty. This is the same discussion as assembly vs. C vs. Java vs. Perl vs. PHP vs. Ruby: how much are you willing to sacrifice in the way of performance in order to make your development easier? I'm quite certain that a CPU designed from the ground up, with every gate and wire and via carefully placed and calculated for optimum usage, would be able to reach power and performance thresholds current CPUs can only dream of. I'm also quite certain it would take at least a decade for each version, never mind different generations. And fixing bugs that arise in the field would be orders of magnitude more difficult, if it were possible at all.

EVERYTHING, from the lowliest lifeforms to the mightiest achievements of mankind, works on the basis of modules: design concepts that are isolated in themselves and which can then be replicated, tweaked slightly, and replicated some more. This is the only way in which anything of significant complexity can be constructed.

Think of it this way: two objects, A and B. Each talks to the other. Two connections. Something goes wrong, easy to figure out where and fix. Now three objects, A, B, C. Each talks to all of the others. Now you've got six connections; three times the complexity. Simple to troubleshoot, right? If A is talking to C and C starts acting strange, C must be the problem? But what if C started acting strange because B did something stupid? Now let's go to four objects, A-D. Now we've got twelve connections. A starts acting weird and crashes. Did A crash because of a bug? First glance says 'of course'. But when a program is handed unreasonable input, sometimes the safest thing to do is to halt the system; to crash. So, what caused A to crash: was it input from B, C, or D? Worse, was it input from B that caused D to misbehave, which then caused A to crash? Now expand that to a hundred objects, or a thousand (there's a little sketch of how fast that grows after this post). And we haven't even covered re-entrant situations, where A calls to B, which calls back to A, which calls to C, which calls D, which then calls B again. Being able to control and define the interactions between objects is the only way to successfully build complex systems in any kind of reasonable timeframe, with any kind of reasonable level of reliability.

AFAIK, pretty much ALL CISC architectures use microcode. The ARM processors are different because they are RISC. They don't use microcode in the traditional sense (generally no microcode ROM), but they DO use a hardware description language (HDL) to describe the instruction set. This gets compiled down to a set of instruction decode blocks which are then implemented as gates in hardware. So the same modular concept exists, but it's all translated to the hardware level. ARM can do this because the instruction set is highly orthogonal and fairly simple in its base complexity (the whole point of RISC). It's also worth noting that in the case of ARM, the vast majority of designs are implemented by hardware manufacturers who can use VHDL to program FPGAs to simulate the final hardware, and run the intended software on it.
As such, in most cases the target system is well defined and understood, making optimizations possible that wouldn't be possible under more general-use conditions. (Note I said the vast majority of DESIGNS; ARM CPUs are present in the majority of cell phones, which means the majority of ARM CPUs are used by those ITEMS, but cell phones account for a tiny number of ARM-based DESIGNS, compared to TVs, refrigerators, microwaves, smart thermostats, medical devices, cars, etc.)
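Here is the little sketch promised above, showing the connection-count growth Gizmo describes: with n objects that each talk directly to every other one, there are n * (n - 1) directed connections, which matches the 2, 6, and 12 in the example.

```python
#!/usr/bin/env python3
# Toy illustration: directed connections between n fully interconnected objects.


def directed_connections(n):
    # each of the n objects talks to each of the other n - 1
    return n * (n - 1)


if __name__ == "__main__":
    for n in (2, 3, 4, 10, 100):
        print("{:>3} objects -> {:>5} connections".format(n, directed_connections(n)))
```

Running it prints 2, 6, 12, 90, and 9900, which is why keeping interactions confined to well-defined module boundaries matters so much as systems grow.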
Wow, so detailed as usual, Gizmo. I'm always impressed with your knowledge and willingness to share it. The ability to program processors is definitely a difficult thing, which is why I would like to take a class on it someday. I know the really technical stuff would be pretty boring, but knowing how it applies when actually coding something is totally valuable.