• 6 Posts
  • 27 Comments
Joined 3Y ago
Cake day: Oct 18, 2019


yeah, but living life with only the essential stuff would be kinda boring, don’t you think?





isn’t that stardew valley? 🤔



it’s really a miracle how all of this is held together while being so cross-platform tbh

the core engine of torch, which contains things like automatic differentiation (the vector calculus bit), tensor operations, data preprocessing, data de/serialisation, et cetera, is written in regular C++, so it runs on pretty much anything a C++ compiler can target, which is basically everything
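just to make the autodiff part concrete, here’s roughly what that engine does for you, seen through the python api (a minimal sketch, nothing hardware-specific going on here):

```python
import torch

# two tensors the autograd engine will track
x = torch.ones(3, requires_grad=True)
w = torch.tensor([2.0, 3.0, 4.0], requires_grad=True)

# a tiny computation: dot product, nonlinearity, reduce to a scalar
y = torch.tanh(w @ x).sum()

# reverse-mode automatic differentiation, done by the C++ engine underneath
y.backward()

print(x.grad)  # dy/dx
print(w.grad)  # dy/dw
```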

the problem starts when you want to add gpu acceleration in order to speed up things like matrix multiplication (which is typically the most computationally expensive part of the machine learning pipeline)
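for a sense of scale: the exact same line of python runs on the cpu or the gpu depending only on where the tensors live, and on big matrices the difference is huge. something like this (standard torch api, the device string is the only thing that changes):

```python
import torch

# fall back to the cpu if no cuda-capable gpu is present
# (rocm builds reuse the "cuda" device name too)
device = "cuda" if torch.cuda.is_available() else "cpu"

a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

# the same matmul call is dispatched either to the gpu BLAS libraries
# or to the cpu kernels, depending on where a and b are stored
c = a @ b
```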

when torch (and other ml libs) started out, cuda was basically the most advanced, easiest-to-use lib for gpu compute (probably still is), nvidia gpus were far superior to anything the competition could offer, and ml on mobile devices wasn’t a thing. so everyone went for it, and for a long time ml existed almost solely on devices with nvidia graphics cards that could support cuda

then amd and arm started to catch up: amd rocm was added to support amd gpus, and a vulkan backend was added to cover gpus on mobile devices as well as nvidia and amd cards. at the moment all of this exists in a kind of mess where practically all functionality is supported if you use cuda, but with rocm and vulkan a lot of things don’t work, and you often have to compile everything from scratch for your setup to be supported
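you can actually see that fragmentation from python; roughly like this (assuming a reasonably recent torch build, the vulkan check in particular isn’t present in every version):

```python
import torch

# cuda backend; note this also reports True on rocm builds, since rocm reuses the cuda api
print("cuda available:", torch.cuda.is_available())

# set to a version string on rocm builds, None on regular cuda/cpu builds
print("rocm/hip build:", torch.version.hip)

# vulkan backend, mostly aimed at mobile; support depends heavily on how torch was built
print("vulkan available:", torch.is_vulkan_available())
```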

and now all of this mess is wrapped in python to simplify the api, which was a big mistake in my opinion, bc not only is the api simplification unnecessary, but now if you want to target any specific architecture, it must be supported by the core torch engine, some version of a gpu compute lib (unless you want to do inference on the cpu, which you prolly don’t), and the python wrapper

so now, bc you want everything to work out of the box, all of these things are put into a binary, which results in this huge file size, and i imagine the maintenance of torch is pretty hard at least partially as a result of this

if you were building something like torch today, things would be a lot simpler, bc you could just write the core engine in smth like C++ and then use smth like vulkan kompute, which is a wrapper api around regular vulkan that’s massively simpler and more user-friendly and supports every gpu under the sun, and boom, you have a much more concise and easily maintainable library


basically torch is a huge lib in itself, and it targets not only virtually all cpu architectures, but also multiple gpu frameworks (cuda, rocm, vulkan), all of which together support thousands of gpus, both desktop and mobile

and all of this is packaged into a single binary so that it works for everyone, regardless of hardware

if you want a smaller size, you can compile it from source, or download a minimised precompiled build for your specific target architecture
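and if you’re curious what your particular binary was actually built with, torch exposes its compile-time config; something like this shows which backends got baked in:

```python
import torch

# prints the build configuration of the installed binary:
# compiler flags, BLAS library, whether CUDA/ROCm support was compiled in, etc.
print(torch.__config__.show())
```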


torch, the ml lib by facebook i’m assuming :)



cross-posted from: https://lemmy.ml/post/258803 > https://archive.is/2022.05.05-125743/https://www.nytimes.com/2022/05/05/world/europe/russian-exiles-foreign-agents-putin.html

iirc there was a discussion here a long time ago, like 1.5-2 years back, about implementing something like that with /b/username (b for blog), but it was decided to postpone it until a later date

you can try to search for it :)


can’t wait for carnists to kill humanity bc they refuse to eat beans 🤪🥳


medium articles containing 800 bytes of text that weigh 7 megabytes will kill the environment long before that surely




fucking tragic, really hope this doesn’t go through

dear brits, i trust in your protest potential. if the extradition is “officially” fully approved, pls don’t be silent; this is a once-in-a-lifetime precedent that’s going to establish this terrible practice


me explaining to nft owners how to open a url

⠀⠀⠘⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡜⠀⠀⠀ ⠀⠀⠀⠑⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡔⠁⠀⠀⠀ ⠀⠀⠀⠀⠈⠢⢄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⠴⠊⠀⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀⠀⢸⠀⠀⠀⢀⣀⣀⣀⣀⣀⡀⠤⠄⠒⠈⠀⠀⠀⠀⠀⠀⠀⠀ ⠀⠀⠀⠀⠀⠀⠀⠘⣀⠄⠊⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀ ⠀ ⣿⣿⣿⣿⣿⣿⣿⣿⡿⠿⠛⠛⠛⠋⠉⠈⠉⠉⠉⠉⠛⠻⢿⣿⣿⣿⣿⣿⣿⣿ ⣿⣿⣿⣿⣿⡿⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠛⢿⣿⣿⣿⣿ ⣿⣿⣿⣿⡏⣀⠀⠀⠀⠀⠀⠀⠀⣀⣤⣤⣤⣄⡀⠀⠀⠀⠀⠀⠀⠀⠙⢿⣿⣿ ⣿⣿⣿⢏⣴⣿⣷⠀⠀⠀⠀⠀⢾⣿⣿⣿⣿⣿⣿⡆⠀⠀⠀⠀⠀⠀⠀⠈⣿⣿ ⣿⣿⣟⣾⣿⡟⠁⠀⠀⠀⠀⠀⢀⣾⣿⣿⣿⣿⣿⣷⢢⠀⠀⠀⠀⠀⠀⠀⢸⣿ ⣿⣿⣿⣿⣟⠀⡴⠄⠀⠀⠀⠀⠀⠀⠙⠻⣿⣿⣿⣿⣷⣄⠀⠀⠀⠀⠀⠀⠀⣿ ⣿⣿⣿⠟⠻⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠶⢴⣿⣿⣿⣿⣿⣧⠀⠀⠀⠀⠀⠀⣿ ⣿⣁⡀⠀⠀⢰⢠⣦⠀⠀⠀⠀⠀⠀⠀⠀⢀⣼⣿⣿⣿⣿⣿⡄⠀⣴⣶⣿⡄⣿ ⣿⡋⠀⠀⠀⠎⢸⣿⡆⠀⠀⠀⠀⠀⠀⣴⣿⣿⣿⣿⣿⣿⣿⠗⢘⣿⣟⠛⠿⣼ ⣿⣿⠋⢀⡌⢰⣿⡿⢿⡀⠀⠀⠀⠀⠀⠙⠿⣿⣿⣿⣿⣿⡇⠀⢸⣿⣿⣧⢀⣼ ⣿⣿⣷⢻⠄⠘⠛⠋⠛⠃⠀⠀⠀⠀⠀⢿⣧⠈⠉⠙⠛⠋⠀⠀⠀⣿⣿⣿⣿⣿ ⣿⣿⣧⠀⠈⢸⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠟⠀⠀⠀⠀⢀⢃⠀⠀⢸⣿⣿⣿⣿ ⣿⣿⡿⠀⠴⢗⣠⣤⣴⡶⠶⠖⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡸⠀⣿⣿⣿⣿ ⣿⣿⣿⡀⢠⣾⣿⠏⠀⠠⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠛⠉⠀⣿⣿⣿⣿ ⣿⣿⣿⣧⠈⢹⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣰⣿⣿⣿⣿ ⣿⣿⣿⣿⡄⠈⠃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣠⣴⣾⣿⣿⣿⣿⣿ ⣿⣿⣿⣿⣧⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣠⣾⣿⣿⣿⣿⣿⣿⣿⣿⣿ ⣿⣿⣿⣿⣷⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣴⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿ ⣿⣿⣿⣿⣿⣦⣄⣀⣀⣀⣀⠀⠀⠀⠀⠘⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿ ⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣷⡄⠀⠀⠀⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿ ⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣧⠀⠀⠀⠙⣿⣿⡟⢻⣿⣿⣿⣿⣿⣿⣿⣿⣿ ⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⠇⠀⠁⠀⠀⠹⣿⠃⠀⣿⣿⣿⣿⣿⣿⣿⣿⣿ ⣿⣿⣿⣿⣿⣿⣿⣿⡿⠛⣿⣿⠀⠀⠀⠀⠀⠀⠀⠀⢐⣿⣿⣿⣿⣿⣿⣿⣿⣿ ⣿⣿⣿⣿⠿⠛⠉⠉⠁⠀⢻⣿⡇⠀⠀⠀⠀⠀⠀⢀⠈⣿⣿⡿⠉⠛⠛⠛⠉⠉ ⣿⡿⠋⠁⠀⠀⢀⣀⣠⡴⣸⣿⣇⡄⠀⠀⠀⠀⢀⡿⠄⠙⠛⠀⣀⣠⣤⣤⠄


forgot to mention, there’s a 20% value added tax on buying gold, making this option even less attractive


i’m afraid the two aren’t necessarily tied, but that said, usd does literally sound like a scamcoin nowadays lol 😬


tor relay definitely 😎

or if you don’t want to bother with that, you can literally just download firefox and install the snowflake extension


not really i’m afraid, mainly because the average russian has literally no savings, and among the ones that do the average amount of savings will buy you like 20-30 grams of gold, which isn’t an amount you can buy iiuc, plus there are issues with storage, and liquidity

ppl who have enough savings to buy into gold just keep their money in usd or some other foreign currency 🤷‍♀️


though it’s possible that this is simply an effort by the central bank to delay the inevitable crash in value 🤔


that’s great news honestly 🥳

i don’t have any savings, but some of my friends do, and ppl seeing their savings basically halved within a few days were devastated, and anxiously wondered whether they should rush to convert to usd or euros to mitigate at least some of the losses

i obviously didn’t know how things would turn out, but said that they should probably hold on, bc buying usd at like 110-120 roubles would be awful. converting anyway didn’t really seem that crazy at the time though, bc there were massive lines to get cash (both roubles and usd) at atms, and ppl speculated the rate would reach 200 roubles per usd or smth crazy like that

there is still the issue that a lot of things basically skyrocketed in price, which is a massive problem, but at least it’s not compounded by shitty currency conversion rates 🤷‍♀️