i realize it is extreme boomer cringe of me to think like this, but i cut my teeth streaming ETL jobs because we did not have enough ram to hold the files and disks were slow to read and write. i get this is the age of computing abundance but the resource stinginess is in my bones
It’s not boomer cringe. It really is a mess out here.
like i get that worrying about ram and cpu cycles is probably dumb but this fucking thing loaded 113 packages exporting 8300 names and 325 names have collisions. sorry to boomer but like messy bessy, clean up this mess!
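for reference, a minimal base-R sketch of how numbers like these can be tallied (the variable names are illustrative, not from the thread):

```r
# count loaded packages, their exported names, and name collisions
pkgs <- loadedNamespaces()
exports <- lapply(pkgs, getNamespaceExports)
all_names <- unlist(exports)
length(pkgs)               # packages loaded
length(all_names)          # names exported in total
sum(table(all_names) > 1)  # names exported by more than one package
```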
Living in similar hell myself. One of the reasons all new platforms (node, ruby, etc.) become unusable is that they pull in multiple gigabytes of libraries to do a single function call. I am guessing it's the quality of devs involved in the ecosystem that creates the problem.
Welcome to the tidyverse
the hadleyverse is not that bad. i am absolutely willing to eat the overhead of rlang/purrr all of the time and chances are i am also happy to use dplyr/tidyr (nfw am i loading the tidyverse metapackage) but tidymodels is a fucking mess
library(tidyverse) is basically half as much clutter as library(tidymodels). i have suffered enough frustration that i can justify the vctrs/rlang overhead. but yeah i am the guy who insists on calling functions like::this
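the like::this style in question, as a toy sketch (the example calls are illustrative, not from the thread):

```r
# namespace-qualified calls: nothing attached via library(), so no collisions
small <- dplyr::filter(mtcars, cyl == 4)
small <- dplyr::mutate(small, wt_kg = wt * 453.6)  # wt is in 1000 lbs
```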
Yeah, it wouldn't be nearly as bad. And your numbers include all the base packages too. It wouldn't surprise me if they have more exports than the packages attached by library(tidyverse). Run `base_pkgs <- search()` and then exclude those packages from your export counts.
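Concretely, that comparison might look like this (a sketch; it assumes you snapshot `search()` in a fresh session before attaching anything):

```r
base_pkgs <- search()                     # snapshot in a fresh session
library(tidyverse)
new_pkgs <- setdiff(search(), base_pkgs)  # only what tidyverse attached
new_pkgs <- sub("^package:", "", new_pkgs)
sum(lengths(lapply(new_pkgs, getNamespaceExports)))  # exports, base excluded
```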
Have you looked into `mlr3` as an alternative?

Oct 24, 2022 · 12:16 AM UTC

look, feel free to call me a hardo, but when i looked into mlr3 the examples assigned variables in the global env with `=` instead of `<-`, which turned me off. some customs need to be followed, even if only to signal to the community that you respect them
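for readers outside R, a toy illustration of the convention at issue:

```r
x <- 10         # idiomatic R: <- for top-level assignment
y = 10          # legal, but reads as unidiomatic to much of the community
mean(x = 1:10)  # inside a call, = binds an argument, not a variable
```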