I’m referring to BIG storage: private clouds, data lakes, etc. For example, with my primary customer, we’ve grown the object storage footprint by 100 petabytes in three years. The rest of the global footprint across 110 sites is another 95PB. Commodity services don’t scale to that, and global data transmission is typically custom tailored to the user’s requirements. Things like a first pass at the edge in 15 remote test sites, each crunching 100TB of raw data down to 10TB for transmission back to core, and that process happens on a clock (a rough sketch of that kind of pipeline is below). Other binary distribution use cases involve transmitting 50GB jobs from other continents back to core for analysis. It’s all still custom. Then there’s all the API back-end work to build out the customer-accessible storage APIs; numerous challenges there.
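Just to make the "first pass at the edge, then ship to core on a clock" idea concrete, here's a minimal Python sketch. It assumes the edge site pushes reduced output to an S3-compatible endpoint at core; the site ID, bucket, directory paths, and the gzip "reduce" step are all placeholders for illustration, not the customer's actual pipeline or tooling.

```python
# Hypothetical edge-site cycle: reduce raw captures, then push results to
# core object storage. Names and the reduction step are placeholders.
import gzip
import shutil
from pathlib import Path

import boto3
from boto3.s3.transfer import TransferConfig

EDGE_SITE = "test-site-07"          # placeholder site ID
CORE_BUCKET = "core-ingest"         # placeholder bucket at core
RAW_DIR = Path("/data/raw")         # raw captures for this cycle
OUT_DIR = Path("/data/reduced")     # reduced output awaiting transfer

# Multipart tuning matters for big objects over long-haul links:
# larger parts and more concurrency keep the pipe full across continents.
xfer = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,   # switch to multipart above 64 MB
    multipart_chunksize=256 * 1024 * 1024,  # 256 MB parts
    max_concurrency=16,
)

s3 = boto3.client("s3")  # endpoint and credentials come from site config


def reduce_file(raw: Path) -> Path:
    """Stand-in for the real first-pass reduction (here: just compress)."""
    out = OUT_DIR / (raw.name + ".gz")
    with raw.open("rb") as src, gzip.open(out, "wb") as dst:
        shutil.copyfileobj(src, dst)
    return out


def run_cycle() -> None:
    OUT_DIR.mkdir(parents=True, exist_ok=True)
    for raw in sorted(RAW_DIR.glob("*.bin")):
        reduced = reduce_file(raw)
        key = f"{EDGE_SITE}/{reduced.name}"
        s3.upload_file(str(reduced), CORE_BUCKET, key, Config=xfer)


if __name__ == "__main__":
    run_cycle()  # in practice this runs under a scheduler, on a clock
```

The real pipelines are custom per site, but the shape is usually the same: a scheduled reduction step, then a tuned bulk transfer back to core.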
Storage Engineer, Storage Consultant, Storage Architect
Then mix in NetApp, Pure, Dell EMC, ECS, StorageGRID, Cleversafe, etc.