IDK, how are we counting? Digestible calories? I don’t think you are getting much energy from any amount of swords that you can fit in your stomach.
I’m pretty sure every microwave just splits the input into the last two digits as a number of seconds and the digits before that as minutes, then runs for 60 * minutes + seconds. So 0:99 is equivalent to 1:39 and 1:80 is equivalent to 2:20. I mean, it is a little weird that the seconds can be >59, and extra weird that you can do 6:66, but it isn’t exactly wizardry.
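For anyone curious, here’s that whole “algorithm” as a few lines of Python (my guess at the parsing, not any particular microwave’s firmware; the function name is just for illustration):

```python
def microwave_seconds(keypad_input: str) -> int:
    """Treat the last two digits as seconds and everything before them as minutes."""
    digits = keypad_input.replace(":", "")
    minutes = int(digits[:-2] or "0")
    seconds = int(digits[-2:])
    return 60 * minutes + seconds

print(microwave_seconds("0:99"))  # 99  (same as 1:39)
print(microwave_seconds("1:80"))  # 140 (same as 2:20)
print(microwave_seconds("6:66"))  # 426
```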
YAML is fine as a configuration language and an OK data input language.
YAML is absolutely cursed as a programming language. As in, Ansible has created a really shitty programming language inside of YAML. It should be burned with fire.
The short answer is that Docker (and other containerization technologies) shares the Linux kernel with the host. The Linux kernel is very complicated and shouldn’t be trusted to be vulnerability-free. Exploitable bugs are regularly discovered in the Linux kernel (and Windows and Darwin). No serious company separates different tenants with just container technology. Look at GCP, AWS, DigitalOcean… they all use hardware virtualization, which is much simpler and much more likely to be secure (but even then bugs are found on occasion).
So in theory it is secure, but it is just too complex to rely on. I’d say Docker is good for “mostly trusted” isolation: different organizations in the same company, or different software that isn’t actively trying to be malicious. But it shouldn’t be used to separate untrusted parties.
I hope they are using more than just docker for isolation 😅 Each user should be running in a different VM for security.
Strongly reminds me of Old MacDonald Had a Barcode, E-I-E-I CAR. Basically: put the standard anti-virus test string (EICAR) into various sorts of barcodes and see what breaks.
> it’s mostly solved already
I wish I believed this. Or I guess I agree that it is solved in most software, but there is lots of commonly used software where it isn’t. One broken bit of software can fairly easily take down a whole site or OS.
Try to create an event in 2040 in your favourite calendar. There is a decent chance it isn’t supported. I would say most calendar servers support it, but the frontends often don’t or vice-versa.
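If you want a concrete way to see why dates around 2040 are the canary, here’s a quick Python sketch of the 32-bit overflow (purely illustrative, not tied to any particular calendar app):

```python
from datetime import datetime, timezone

# A signed 32-bit time_t tops out at 2**31 - 1 seconds after the Unix epoch.
MAX_32BIT = 2**31 - 1
print(datetime.fromtimestamp(MAX_32BIT, tz=timezone.utc))  # 2038-01-19 03:14:07+00:00

# Anything in 2040 needs a bigger timestamp, so any component still storing
# times as a signed 32-bit integer simply can't represent the event.
event_2040 = int(datetime(2040, 1, 1, tzinfo=timezone.utc).timestamp())
print(event_2040 > MAX_32BIT)  # True
```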
I wouldn’t call a nail hard to use because I don’t have a hammer. Yes, you need the right hardware, but there is no difference in the difficulty. But I understand what you are trying to say, just wanted to clarify that it isn’t hard, just not widespread yet.
> which is hard to decode using hardware acceleration
This is a little misleading. There is nothing fundamental about AV1 that makes it hard to decode; hardware support is just not widespread yet (mostly because it is a relatively new codec).
Just to be clear, it is probably a good thing that YouTube re-encodes all videos. Video is a highly complex format and decoders are prone to security vulnerabilities. By transcoding everything (in a controlled sandbox), YouTube takes most of this risk on itself and makes it highly unlikely that the video it serves to the general public can exploit any bugs in decoders.
Plus YouTube serves videos in a variety of formats and resolutions (and now different bitrates within a resolution). So even if they did try to preserve the original encoding where possible you wouldn’t get it most of the time because there is a better match for your device.
From my experience it doesn’t matter if there is an “Enhanced Bitrate” option or not. My assumption is that around the time that they added this option they dropped the regular 1080p bitrate for all videos. However they likely didn’t eagerly re-encode old videos. So old videos still look OK for “1080p” but newer videos look trash whether or not the “1080p Enhanced Bitrate” option is available.
It may be worth right-clicking the video and choosing “Stats for Nerds”; this will show you the video codec being used. For me 1080p is typically VP9 while 4k is usually AV1. Since AV1 is a newer codec, it is quite likely that you don’t have hardware decoding support.
I’m pretty sure that YouTube has been compressing videos harder in general. This loosely correlates with their release of the “1080p Enhanced Bitrate” option. But even 4k videos seem to have gotten worse to my eyes.
Watching at a higher resolution is definitely a valid strategy. Optimal video compression is very complicated, and while compressing at the native resolution is more efficient, you can only go so far with fewer bits. Since the higher resolution versions have higher bitrates, they just fundamentally have more data available and will give an overall better picture. If you are worried about possible fuzziness, you can try using 4k rather than 1440p, as it is a clean doubling of 1080p so you won’t lose any crisp edges.
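If it helps to see the scaling argument in numbers, here’s a tiny sketch (assuming the standard 16:9 resolutions):

```python
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4k": (3840, 2160)}

base_w, _ = resolutions["1080p"]
for name, (w, _) in resolutions.items():
    # An integer factor (2.00x for 4k) means each 1080p pixel maps cleanly onto
    # a whole block of screen pixels, so sharp edges stay sharp; 1440p lands at
    # 1.33x, which forces interpolation.
    print(f"{name}: {w / base_w:.2f}x per axis")
```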
I like trains because they are much more comfortable than airplanes (especially with TSA gymnastics) and I can do something interesting or productive rather than try to focus on the road for hours on end.
More like re-purchase it for twice the price on the next console’s eShop
This is how I feel when my partner is using their phone at night.
These units hurt me. For others with the same pain, 20 oz is a bit over half a liter.
If you are Canadian, Kawartha Dairy has very good mint chocolate chip that is reasonably priced. It isn’t quite the best I’ve ever had, but at not much more than the price of the cheap stuff at the grocery store it is perfect to have in the freezer as an “everyday” treat.
Is the limit two VMs or two macOS VMs? I thought it was technically a “licensing” restriction.