I highly recommend Backblaze.
I’ve been looking at ways to reduce cable clutter, which basically means trying to have cables that are not excessively long. For example, the cable for my Seagate hard drives (regular USB-A on the end that plugs into the computer or hub, but I think it is called Type B or Micro-B on the end that plugs into the drive) is about 4 feet long. In my setup, the drive is only about a foot away from the computer, and I have a second Seagate drive plugged into the first one with the same connections - but they are right next to each other. So, lots of unused cable distance. So I was looking for options, and almost all cables seem to start at 3 feet. Once in a while I find something that is 1.5 feet, but there are few options. It just seems like there would be a market for short cables. I can coil them up, but it still adds to the clutter.
0.5 foot and 1 foot cables at monoprice.com.
$1.57 for usb c to usb c.
FYI - they have other colors besides pink
Trouble is, those are USB-C, and I presently have nothing that is USB-C other than the ports on my Mini.
My first external drive is USB 3.0 and plugs into the standard USB 3.0 ports on the mini. But the end going into the Seagate drive is a different design. I think it is called a micro-B.
The second Seagate is plugged into the first with the same two different USB connectors. Then I have an old 2.0 drive connected to the second Seagate with yet another double-ended connector, which I believe is "A-male to B-male". It also has two ferrite things on it.
I guess I could look for a shorter cord going from USB-C to the Micro-B on the Seagate, and another using standard USB-A to Micro-B to connect the two drives, as first steps.
Also looking for a shorter HDMI cable for one of my connections (I have a number of HDMI connections, and a couple are pretty short). I found this, but as I research pretty much any cable, I keep finding people talking about things I never even thought of. In the case of HDMI, not only 4K at 60 Hz, but also supporting something called 4:4:4 chroma (I forget exactly what that is, but it is important for text, as I recall from past reading).
USB gets complicated too. Ideally I would want to support 3.0 and even 3.1 and Thunderbolt at max speed, but apparently some cables also only support limited power, which can matter in cases where you may be charging devices.
Here is an HDMI I’m looking at:
The iMac update I had been waiting for a couple months ago finally happened so I could compare the Mini I got (because it already had been updated) to the new iMacs. Prices are based on my military/veterans discount which is better than the education discounts I used to use.
I spent $1349 for the Mini since I went with the base 6-core rather than 4 core, jacked the ram to 16 GB and got 512 GB SSD.
The same processor, ram, and SSD on the iMac cost $720 more. Of course, I would have gained a really nice 27 inch monitor, a keyboard, and a mouse. But I sure don’t need another mouse and keyboard, so that leaves the monitor. But for about $320, I got a 43 inch LG TV to use as a monitor. This has about 2.5 times the area of the iMac screen! Quality would not be as good, but frankly, I don’t think I’ll generally notice. The fact that it is 4K is sufficient for my preferences anyway - clarity of text! Certainly looks great for movies too, but that is secondary for me. And besides the additional space, I can just switch inputs to use it as a TV if I want a special experience (normally I use a cheaper and smaller 1080p TV for watching television).
So, net savings of $400 and a much bigger screen. Oh, and while the iMac has 4 USB 3.1 ports (double what the Mini has), it only has 2 Thunderbolt 3 ports (half what the Mini has). While I’d love to have more USB 3.1 ports, given a choice, I’d rather have the Thunderbolt, and this will probably be even more important over time. Sure, right now I use an adaptor for one regular USB device, but that’s not a big deal.
iMac has a display port while the Mini has HDMI.
Now, looking forward to when I buy another computer, assuming the Mini continues to be updated (which isn’t a given), I only have to replace the computer, but keep the screen. Likewise, if I decide I want an even better screen, I only have to replace that. If the Mini isn’t updated then maybe I would have to look at an iMac again, but I’ll still have saved a considerable amount of money this time. And could still use the screen I use with the mini as another screen for the iMac if I wanted. The Mini could then be used as a server. I guess an old iMac could do that too, but would be much less convenient.
You could convert a USB C port to USB A with these $7 passive adapters.
Also, I think every Thunderbolt 3 port (on a Mac) is also a DisplayPort (and can therefore also do HDMI); you just need a low cost passive adapter to do the job.
I don’t have those - but I did buy one which is also a short extender cable. Right now that is all I need.
From what I’ve read, a Thunderbolt port can handle two 4K monitors, so if I ever need more than the HDMI, I should be fine.
I also looked at an iMac like I usually do - pushing the upgrades as long as the prices were not ridiculous. Which means, to take one example, I wouldn’t be upgrading to a 2 TB SSD! Or 32 GB memory…
But I would bump up the CPU. I wouldn’t go to the top graphics card because that was, I think, $450 more than the one below it. I’d do a 3 TB Fusion Drive. After all, I know that even a 1 TB drive won’t hold everything (iTunes stuff would still have to be on an external drive). That being the case, I might as well get a 3 TB Fusion Drive, which would hold everything, and it has a 128 GB SSD section which would make most of my common stuff work fast.
All of that would put my discount price at a bit over $3000. Which is more than double what I paid. I really don’t think I’d see much advantage for all that extra money. Maybe someone heavy into producing video, etc, would, but not me.
I found a 43UK6500 at my local Sam’s Club yesterday for $203 + tax. It now resides on my desk at work!! My laptop/dock can only drive it at 30 Hz, but it is still a significant step up from two 23” 1080p monitors.
Xbhambc - isn’t it amazing how much screen two hundred bucks buys now?
Once you start using the huge screen area, it’s very hard to go back to 24 inch monitors.
This change in the market is why it now makes more sense to buy your screen a-la-carte, rather than inside the computer box, like the iMac.
iMacs are great machines, but the change in the commodity 4K screen market is so dramatic, it’s now a better deal to buy a separate computer.
As long as they keep upgrading the Mini! If they were to let it die, then we’d be faced with the only non-iMac desktop being a Mac Pro!
Wait’ll they migrate to the A-series CPUs.
They are so power-efficient, Apple will soon be able to pack dozens of cores into a mini-sized package.
The mini market segment has lots of room to grow, for many years.
Good times for users.
Oh, I think there are all kinds of possibilities for the Mini. But I don’t necessarily have confidence that Apple won’t skip over it. And then there is the question of, if they do keep it going, how often are the upgrades (and how big). Because it really bugs me that Apple will keep the same thing on the market for years at the original price! I mean, I paid $999 for the 24 inch cinema display in 2009. I know they upgraded it to thunderbolt, but I don’t know if they did anything else to improve it. But right up to the point they stopped selling it, they were still selling at the same price as I recall.
So, suppose they only upgrade the mini every 4 years. Well, I may be just fine with waiting 5 or 6 years - at which point the latest is a year or two old and still the same price. It wouldn’t be so bad if we knew when they would be updated, as then I could time it. But if I think they may have a new one at 4 and 8 years, so I opt to skip the 4 year one and just stretch things out, it would be annoying if the next one wasn’t for another 6 years instead of 4!
That is one thing I like about iPhones. There may always be the question of whether a given update should be skipped in hopes of something better the next year, but at least it improves every year.
Speaking of which, that’s what I’m thinking about now. Normally I buy every 2 years, but improvements are less noteworthy as the product matures. So I have the X, and I skipped this year. Not sure if next year will bring a big enough change to justify buying. Three lenses would be great if they gave MUCH better zoom (not just 3x instead of 2x). But rumor is that the three lenses, even if they do what I want, will only be on the biggest phone. I also want that Pixel ability to take good photos in fairly dim light - but that should only require software, since the Pixel only has one lens.
We think the future actually looks quite promising for the mini.
And the processor is at the “core” of this.
The reason the iPhone gets updated each year is that Apple designs their own processors.
They drive the pace of change because they control their own silicon architecture now.
The reason Macs had slowed down their updates is that Intel slowed their pace of processor advances.
Apple’s decision to switch to their own A-series is a fantastic win for consumers.
It means better machines at a faster pace of innovation.
Going forward, the mini will evolve at a pace more like iPhone.
That’s a good thing.
Just imagine this scene a few years from now -
Pairing your new 96-core mini with your 8K array of 1 mm-thin micro-LED wall tiles.
Connected wirelessly via WiGig to both your home mini and your iPhone XZ.
33 million pixels that cover a full wall in your home.
Shipped in a stack of 1 square foot tiles that you peel and stick on your wall.
Remember that LG tag line -
That’s a good point, though I think Apple could have put something in the mini for minor updates prior to this bigger one. I think part of the reason the phone keeps getting updated includes the fact that even at this point, there are still notable improvements they can make in a single year even if you just count the camera. Also, they are pushed into it because everyone else keeps updating. If Apple skipped a year at this point, there would be a lot of fallout, imo. But someday things may slow down even for phones.
But I think if they use their own chips for the Mini, that will help the pace.
Funny, because I’ve always thought (cost permitting) that it would be great to have videos of beaches on my walls so it would look like I was outside. Figured it would be relaxing!
Well, I figured out a problem I had with the Mini and 4K TV. I had an issue with the color (being partly color blind makes it harder to evaluate). But I had a spreadsheet with cells filled with a shade of yellow. On my iMac screen it looked about the same as color printouts, but on my 4K TV the color looked more like orange or something. It was a very obvious difference, even to my eyes.
Well, tonight I got to wondering about a particular picture setting. I had set it for “games”. I forget the pros and cons of that, but when I changed it to “cinema”, it looked about right. The other options were clearly worse.
Don’t understand why colors would change so drastically, but nice to know the solution. Later I’ll need to redo my research to see what advantages the “game” setting may have so I can decide what the best trade-off is.
In case anyone is wondering about how many cores to get in a CPU, I found some info in Activity Monitor on my Mac Mini that shows how much of the CPU is being used on each core (in 5% increments).
I got the 6 core version rather than the four core. My question was whether the stuff I normally use would be spread out over all 6. I knew some apps were designed to take advantage of all available cores and some weren’t. I didn’t know if lots of simpler apps would divide themselves across the cores.
At the moment, I have the following open:
BBEdit (half a dozen documents), Mail, iTunes, Safari (9 tabs), two windows of Firefox (20 total tabs), Numbers (5 documents), and Activity Monitor.
I almost always show all 6 cores with at least 5% activity. On one brief occasion only 4 cores were in use, but that was only for seconds.
As I type this, no core is using more than 10%, but some things surprise me. For example, simply by reloading a tab in Firefox, two cores jumped to 25% and the rest were at least 15%.
I’ve done things where I’ve seen bigger jumps, but the main point is - most people probably don’t need 6 cores, but if you are like me, you’ll be glad to know that if you have 6, you are probably going to benefit, even before you add in something that is more demanding.
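If you’re curious why all the cores light up at once, here’s a minimal Python sketch (standard library only; nothing specific to any of the apps above) of how a program fans CPU-bound work out across every core:

```python
# Minimal sketch of fanning CPU-bound work out to every core, which is
# why Activity Monitor shows all of them busy at once. Stdlib only.
import math
import os
from concurrent.futures import ProcessPoolExecutor

def burn(n: int) -> float:
    # Deliberately CPU-heavy busywork: sum a couple million square roots.
    return sum(math.sqrt(i) for i in range(n))

if __name__ == "__main__":
    cores = os.cpu_count() or 1
    # ProcessPoolExecutor defaults to one worker process per core.
    with ProcessPoolExecutor() as pool:
        totals = list(pool.map(burn, [2_000_000] * cores))
    print(f"Finished {cores} chunks across {cores} cores")
```

Run it with Activity Monitor open and every core should jump, much like the Firefox tab reload did.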
Just ran a backup of my phone using iMazing. First time in a long time. At about the time it was done, I glanced at the CPU usage and all 6 were quite high - highest ones were probably around 80%!
There is absolutely nothing wrong with having some cores doing nothing. They power down to almost nothing, and they’re able to power back up and get busy in a tiny fraction of a second.
Where it’s incredibly important to have more cores is when there is something going on, and you’d still like your computer to operate normally. For example, the OS will occasionally kick off a daemon to visually inspect all of my photos, to try to put names to faces and what-not. This daemon is well designed to kick off when I’m busy doing something, to make sure that I know it’s running and doing its job (by slowing down my machine to the point that I have to “ps -ef | more” etc. and figure out WTF?!?).
With enough RAM, a flash drive (which can interleave multiple streams of requests without blocking), and plenty of cores, you never see this crap. It’s a luxury, but if your time is important to you, then it’s well worth it. I love my old trashcan Mac Pro for just this reason … it never makes me wait.
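If you’d rather not eyeball the “ps -ef | more” output by hand, here’s a small standard-library Python sketch of the same hunt. It shells out to ps, so the column keywords are an assumption that holds on macOS and procps Linux, and sorts by CPU:

```python
# Script the "which process is eating my CPU?" hunt instead of
# scanning "ps -ef | more" by eye. Standard library only.
import subprocess

# -A: every process; -o: choose columns. %cpu and comm work on
# macOS and procps Linux ps; other systems may differ.
out = subprocess.run(
    ["ps", "-Ao", "%cpu,comm"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()

rows = []
for line in out[1:]:  # out[0] is the "%CPU COMMAND" header row
    line = line.strip()
    if not line:
        continue
    cpu, _, comm = line.partition(" ")
    rows.append((float(cpu), comm.strip()))

# Show the five hungriest processes, heaviest first.
for cpu, comm in sorted(rows, reverse=True)[:5]:
    print(f"{cpu:5.1f}%  {comm}")
```

On a machine with enough cores, even the top entry here is usually harmless; the point is being able to see at a glance when it isn’t.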
Well, I think I found a way to pretty much max out all 6 cores. I was trying an app for saving videos from the net, and it can save up to 4K and convert to an MP4 from whatever the original format is, if the video is at 4K quality to begin with.
My first tests were usually 1080p, but I’m doing a 7 minute 4K video right now and noticed all 6 cores are often at 90-95%! But as I continue my other work at the same time, things seem to be working fine. It is taking a lot of time, though (1080p files were fast). It seems the conversion is the slow part.