I'm a math/compsci major (or, at least, almost; I graduate in April), and I've spent a fair amount of time TAing and tutoring math, so this discussion has been fascinating.
I mean, not only is it about purpose, as others have mentioned, but I'm inclined to agree at least somewhat with this mindset. Math definitions especially can vary slightly from source to source, depending on what the author finds most useful in a specific context. And this conversation seems to hinge a lot on definitions.
The crux of the issue is this quote. Others have already debated you on it, but I'll give it one last shot. I think I see a way to interpret it that does seem logically consistent... but I'll also discuss why calling this "division" is maybe not the best idea regardless.
First, we need to agree on what a "cut" entails. Since this is non-realtime communication, I'll take my best guess at how you'd define it. If the denominator of your fraction is the number of cuts, then why isn't 1/1 half a pizza? How are you defining a "cut" so that one cut still leaves you with a whole pizza?
The only way I can think of to define the fraction by the number of cuts without running into massive headaches (e.g. the 1/1 = half issue above) is the following. And even then, there are sacrifices to make.
We define a "cut" as an operation that takes a pizza cutter to the centre of the pizza, pushes it down, and then drags it out to the edge.
Then 1 cut results in a pizza that looks like the Matoran symbol for "1" (the Matoran numeric symbols are a great analogy for this definition of cutting). What results is still one whole (if deformed) pizza, so 1/1 = 1 and we are good.
A second (non-overlapping) cut will divide the pizza into two separate slices, no matter what angle it's made at. Of course, the slices will not be of equal size if you choose any angle except 180 degrees from the first cut, and there's no way to cut a pizza consistently so that two cuts give you two equal halves and then one additional cut gives you three equal thirds, so we'll have to stop caring about slice sizes if we want to use this definition. If we do that, then we'll say all that matters is that we have two (potentially non-equal) "halves" of a pizza, and so 1/2 seems to check out. With three cuts, we'll split one of our "halves" into two pieces, meaning we'll have 2+1 = 3 separate pieces in total. We can call these (potentially non-equal) pieces "thirds", and then 1/3 seems to check out. And so on.
[Update: Reading this over after submitting, I realize one may argue that the cuts don't have to be consistent, if you think of "cut" not as each step in arriving at, say, 72 slices of pizza (i.e. an operator on a single input, successively applied), but rather as a function that takes the total number of cuts as one of its two inputs. I guess it's hard to argue with this (two inputs makes more sense for division), so you can ignore the "equal size" issues above and in the rest of this post; there are still other issues that remain.]
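To make that counting concrete, here's a minimal sketch in Python (the name `pieces` is mine, and the model is just my best guess at your definition): n non-overlapping centre-to-edge cuts leave max(1, n) pieces, because the first cut only deforms the pizza.

```python
def pieces(cuts):
    """Separate pieces left after `cuts` non-overlapping centre-to-edge cuts."""
    if not isinstance(cuts, int) or cuts < 0:
        raise ValueError("only a whole, non-negative number of cuts is defined")
    # 0 cuts leave the pizza untouched; 1 cut deforms it but leaves it whole;
    # each further cut separates off one more piece.
    return max(1, cuts)

for n in range(5):
    print(n, "cut(s) ->", pieces(n), "piece(s)")
# 0 cut(s) -> 1 piece(s)
# 1 cut(s) -> 1 piece(s)   (the Matoran-"1" pizza: deformed but whole)
# 2 cut(s) -> 2 piece(s)
# 3 cut(s) -> 3 piece(s)
# 4 cut(s) -> 4 piece(s)
```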
With this definition, there is now a real-life distinction between cutting zero times and cutting once, while still letting 1/1 = 1 and 1/0 = 1. But... should this "/" operation be called division? Is that the most logical way to think about this? As said before, there's no way to cut consistently at the same angle each time and always end up with equal-sized slices. Can we really call these "halves" 1/2 of a pizza, then? Can we call these "thirds" 1/3 of a pizza? And since this whole conversation is about real-world practicality, how practical is "1 cut" under this definition? All it does is slightly deform the pizza without splitting it. Is being able to tell a person to "divide that pizza there by zero" a practical, real-world gain?
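Here's what that "/" would compute, continuing the sketch above (`cut_div` is my name for it, and per the update I'm ignoring the unequal-size problem and pretending each piece is 1/pieces(n) of a pizza):

```python
from fractions import Fraction

def cut_div(cuts):
    """The proposed "1 / cuts": one piece of a pizza cut `cuts` times."""
    return Fraction(1, pieces(cuts))  # pieces() from the sketch above

print(cut_div(0))  # 1    "1/0 = 1"
print(cut_div(1))  # 1    "1/1 = 1" -- same output as 1/0, so the map isn't injective
print(cut_div(2))  # 1/2
print(cut_div(3))  # 1/3
```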
Furthermore, let's leave the natural numbers. What is "half a cut" here? A cut that starts at the centre and only makes it halfway to the edge? Does that mean 1/0.5 = 1? Is that what we want from our definition of "division"? If so, we've warped our "division" so much that perhaps we should call it something else. What about dividing by -1? Perhaps we restrict "division" to only ever take natural-number inputs, but that still feels wrong, because we're allowing rational outputs, which means our set of inputs is not closed under our "division" operation. Is that a desirable property?
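In the toy sketch above, that closure problem shows up immediately: the outputs are rationals, but only whole numbers of cuts were ever defined, so we can't feed an output back in.

```python
half = cut_div(2)   # Fraction(1, 2): a rational output...
cut_div(half)       # ...raises ValueError: "half a cut" was never defined
```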
Not to mention that division is usually best viewed as the inverse of multiplication. What is multiplication here? How about gluing pizza slices back together with some healthy melted cheese: two halves make a whole (2 * 1/2 = 1), three thirds make a whole (3 * 1/3 = 1). But what about 2 * 1/3? Since size equality could not be guaranteed with this definition of cutting (one of our sacrifices), we can't have a pizza split into 1/3 and 2/3. Gluing back the two pieces that came from splitting a "half" just gives us back that "half", so 2 * 1/3 = 1/2. That could be a problem. We also now have 0 * 1 = 1 (since 1/0 = 1 here, and a/b = c should mean b * c = a), which many would interpret as "wrong".
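If "division" really inverts the gluing multiplication, then k * (1/k) should always give back one whole pizza. Checking that against the toy model (using ordinary Fraction arithmetic for the gluing):

```python
for k in range(4):
    piece = cut_div(k)              # "1/k" in the cut sense
    print(f"{k} * {piece} = {k * piece}")
# 0 * 1 = 0   <- ordinary multiplication says 0, but the cut model needs
#                0 * 1 = 1 to invert "1/0 = 1"; the two can't both hold
# 1 * 1 = 1
# 2 * 1/2 = 1
# 3 * 1/3 = 1
```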
TL;DR: you can define a (non-injective) cut function that works as you describe without any logical inconsistencies, but does relating this function to "division" actually "solve problems" or carry any worthwhile meaning? Perhaps thinking about splitting the pizza into equal groups is a better definition, in which case you get the "zero groups is nonsensical" issue that others have mentioned, but a whole lot of intuitive benefits.
If you divide a pizza by 0, the pizza also doesn't become undefined. That doesn't mean you have a sensible answer for 1/0.
@Waj Your response reminded me of this. Not in the sense that yours was a bad explanation (I think it was really good and helpful); just that "explaining fractions using abstract algebra" was an amusing similarity.