@adhesive_wombat 13d
You can still have both: an algorithm that technically has no formal numerical limit, but that warns or errors when it reaches some condition that is clearly outside the design scope.

For example, you might have a system that warns somehow if you start inserting items at the front of a vector with over a million items (when you originally expected the vector to be used for tens of items). If the structure is suddenly being used for something way outside of its design scope, it's probably in need of a re-think.
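Something like this, for illustration (the class, the names, and the threshold are all made up here, not from any real codebase):

```cpp
#include <cstdio>
#include <vector>

// Hypothetical wrapper: no hard cap, but it complains loudly once the
// container grows far beyond the scale it was designed for.
template <typename T>
class SmallVec {
public:
    void push_front(const T& v) {
        if (items_.size() >= kDesignLimit && !warned_) {
            std::fprintf(stderr,
                "SmallVec: %zu items, designed for tens -- rethink this usage\n",
                items_.size());
            warned_ = true;  // warn once, keep working: no formal limit
        }
        items_.insert(items_.begin(), v);  // O(n) -- fine for tens of items
    }
    std::size_t size() const { return items_.size(); }

private:
    static constexpr std::size_t kDesignLimit = 1'000'000;
    std::vector<T> items_;
    bool warned_ = false;
};
```

The point is that nothing breaks at the threshold; you just get a signal that the usage has drifted outside the design assumptions.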

@TazeTSchnitzel 13d
You can see this kind of design in action in the Source engine, perhaps inherited from Quake (which is partly Carmack's work of course). There are hard limits on things like how many objects can exist at once. Occasionally those get hit… usually due to a memory leak![0] So the artificial limit helps find real problems. That's also been my experience with PHP, which limits the amount of memory a request or script execution can consume by default.

[0] https://www.youtube.com/watch?v=pw2X1yhrDdE — check out shounic's other videos too, there's lots of interesting content there about how complex game systems fail, including about these limits
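The Quake-style pattern is roughly a fixed-size pool: allocation fails when the cap is hit, so a leak surfaces quickly at the limit instead of as slow unbounded growth. A minimal sketch (the cap and all names are invented here, not the engines' actual code):

```cpp
#include <cstddef>

constexpr std::size_t MAX_EDICTS = 2048;  // invented cap, in the spirit of the engine limits

struct Entity {
    bool in_use = false;
    // ... game state ...
};

// Fixed pool: spawning past the cap returns nullptr, which callers can
// turn into a loud error -- a memory leak hits this wall fast.
struct EntityPool {
    Entity slots[MAX_EDICTS];

    Entity* spawn() {
        for (auto& e : slots)
            if (!e.in_use) { e.in_use = true; return &e; }
        return nullptr;  // hard limit reached: likely a leak, not real demand
    }

    void release(Entity* e) { e->in_use = false; }
};
```

A leaked entity never gets released, so the pool fills up and the failure is immediate and visible rather than a gradual slowdown.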

@cjfd 13d
Well, John Carmack is a game programmer, at least originally. In games one has strong upper limits on how long it can take to calculate the next frame. So, it may very well be that something can be said for this in that context. I am not sure in general, though. If one limits the length of the name of a person in some record, one can wait until one day a person arrives whose name is one character longer, and then a programmer needs to get involved to increase the limit. Okay, maybe there exists a ridiculously large maximum name length such that this problem is unlikely to occur. Then one may still have the problem that if all people had names approaching this limit, the system might get too slow as well. So maybe we should impose a limit on the sum of the lengths of all the names in the system. Hmmm... Kind of gets a bit too complicated and arbitrary for my tastes. I think for many systems Carmack's advice really is not very good.

@shoo 13d
> when you design software you should have an idea under what circumstances it will run and optimize for

This philosophy is sometimes referred to as "data-oriented design"

see also: Mike Acton's CppCon 2014 talk: https://www.youtube.com/watch?v=rX0ItVEVjHc

Data-oriented design seems appropriate for shipping high-performance games, and shipping them on schedule and on budget. The goal is not to design and code the algorithm with the best theoretical scalability characteristics - and publish it in a paper, or invent the most mathematically general version of the algorithm to be factored out into a library for use by 100 hypothetical future projects. The goal is not to design a system using hexagonal architecture (or whatever) that can more flexibly accommodate arbitrary changes of requirements in future, at the expense of performance. The source code is not the product that the customers buy, the game is the product.

The goal is to understand the problem you actually have - processing data of a certain finite size or distribution of sizes, on some fixed finite set of target hardware platforms, and figure out how to do this in a way that hits performance targets, with a limited budget of engineering time, to keep moving forward on schedule toward shipping the product.
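As a toy illustration of that mindset (my own example, not Acton's): lay out only the data the hot loop actually touches, sized for the real workload, rather than an array of general-purpose objects:

```cpp
#include <cstddef>

// Struct-of-arrays for the one hot operation we actually have:
// integrating positions for a known, bounded number of particles.
constexpr std::size_t MAX_PARTICLES = 4096;  // sized for the actual workload

struct Particles {
    std::size_t count = 0;
    float x[MAX_PARTICLES], y[MAX_PARTICLES];
    float vx[MAX_PARTICLES], vy[MAX_PARTICLES];
};

// The hot loop touches only contiguous position/velocity arrays --
// cache-friendly and trivially vectorizable, no per-object indirection.
void integrate(Particles& p, float dt) {
    for (std::size_t i = 0; i < p.count; ++i) {
        p.x[i] += p.vx[i] * dt;
        p.y[i] += p.vy[i] * dt;
    }
}
```

Nothing here generalizes to arbitrary entity types or unbounded counts, and that's the point: it solves the problem actually in front of you, fast, on the hardware you actually target.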

@kristopolous 12d
"Generalized problems are the hardest, so why needlessly pretend like you have them when you don't?"

@quelsolaar 13d
I've come to agree with this. Say you have a struct that contains a name. If you limit the name size to, say, 64 bytes, then you can store it in the struct; otherwise you need a separate allocation and an indirection. This makes the code slower, more error prone and more complex to use. So think hard about when "infinite" is justified.
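The trade-off in code (the 64-byte figure is from the comment above; the struct names and helper are just illustrative):

```cpp
#include <cstring>

// Fixed cap: the name lives inside the struct -- one allocation,
// contiguous memory, trivial copy/free semantics.
struct PersonFixed {
    char name[64];
    int  age;
};

// "Infinite" name: an extra heap allocation and an indirection,
// plus ownership/lifetime bookkeeping the fixed version never needs.
struct PersonDyn {
    char* name;  // separately allocated, must be freed
    int   age;
};

// With the fixed cap, overflow becomes an explicit policy decision:
// reject (as here), or truncate.
inline bool set_name(PersonFixed& p, const char* s) {
    if (std::strlen(s) >= sizeof p.name) return false;
    std::strcpy(p.name, s);
    return true;
}
```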

@MawKKe 13d
Sure if you know the operating domain, just hard code some sensible upper limit and move on. Not that this way is "better" in all cases. YAGNI and all that

@roflyear 13d
Only works when failing early is better than running slow, which IME is not often the case

@mhh__ 13d
> 1m keys

As opposed to not working at all?