this post was submitted on 24 Jul 2022
53 points (93.4% liked)

Programmer Humor


Post funny things about programming here! (Or just rant about your favourite programming language.)

top 8 comments
[–] Lakso@ttrpg.network 11 points 1 year ago

Lua is just a based language. It has strong unpopular opinions and doesn't care what you think.
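Presumably the "strong unpopular opinion" in question is Lua's 1-based table indexing. A minimal sketch of how that behaves, for anyone who hasn't touched Lua:

```lua
-- Lua sequences built with {...} are conventionally indexed from 1.
local fruits = { "apple", "banana", "cherry" }

print(fruits[1])  --> apple  (the first element sits at index 1, not 0)
print(fruits[0])  --> nil    (index 0 is simply not part of the sequence)
print(#fruits)    --> 3      (the length operator counts the 1..n range)

-- ipairs iterates the sequence starting at index 1.
for i, fruit in ipairs(fruits) do
  print(i, fruit)
end
```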

[–] Puffymumpkins@lemmy.world 2 points 1 year ago (1 children)

This is correct. The natural numbers are better known as the counting numbers, and do not include 0 or the negative numbers. I personally think the classification is a bit bullshit, given how they really only matter in meatspace. I have never seen the distinction between integers and natural numbers come up in real mathematics.

[–] CHINESEBOTTROLL@sh.itjust.works 1 points 1 year ago (1 children)

Idk if you are trolling, but in most cases 0 is considered a part of the natural numbers. And there is a huge difference between the naturals and the integers: the naturals are for induction, the integers are for algebra.
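For reference, the induction principle being alluded to, stated over the naturals with 0 as the base case, looks like this (starting the naturals at 1 only shifts the base case to P(1)):

```latex
P(0) \;\land\; \bigl(\forall n \in \mathbb{N}:\; P(n) \Rightarrow P(n+1)\bigr)
\;\Longrightarrow\; \forall n \in \mathbb{N}:\; P(n)
```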

[–] Lakso@ttrpg.network 1 points 1 year ago (1 children)

Depends on where you are! In some places it is more common to say that 0 is natural, and in others not. Some argue it's useful to have it in N; some say that it makes more historical and logical sense for 0 not to be in N and use N_0 when including it. It's not a settled issue, it's a matter of perspective.

[–] CHINESEBOTTROLL@sh.itjust.works 2 points 1 year ago* (last edited 1 year ago) (1 children)

I guess it depends on the place. But the arguments for not including it seem weak when

  • we use 0 to even write the other natural numbers
  • we define almost all of our algebraic objects (groups, rings/fields, modules/vector spaces, algebras) to include 0
  • we don't do modular arithmetic with {1,...,n}; that would be crazy

Of course, 0 vs no 0 only matters if you actually do arithmetic with it. If you only index, you could just as well start at 5.

(The only reasons I can think of to start at 1 are that 1 is then the 1st element, and that the sequence (1/n) is defined for all natural n.)
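A quick sketch of the modular-arithmetic point from the list above: in Lua, `a % n` already lands in {0, ..., n-1} for a positive divisor n, which is exactly the set you'd want 0 to be the first element of:

```lua
-- Residues mod n: Lua's % uses floor division, so the result is in 0 .. n-1
-- for a positive divisor, even when the dividend is negative.
local n = 5
for a = -3, 7 do
  print(string.format("%3d mod %d = %d", a, n, a % n))
end

-- Clock-style wraparound: hour 0 is a perfectly sensible starting point.
print((23 + 2) % 24)  --> 1  (two hours after 23:00)
```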

[–] Lakso@ttrpg.network 0 points 1 year ago* (last edited 1 year ago) (1 children)

Those are valid points and make some practical sense, but I've talked too much with mathematicians about this, so let me give you another point of view.

First of all, we do modular arithmetic with the integers, not the natural numbers; the same goes for all those objects you listed.

On the first point, we are not talking about 0 as a digit but as a number. The main argument against 0 being in N is more a philosophical one: what are we looking at when we study N? What is this set? "The integers starting from 0" seems like a bit of a weird definition. Historically, the natural numbers were always the counting numbers, and that doesn't include 0 because you can't have 0 apples, so when we talk about N we're talking about the counting numbers. That's just the consensus where I'm from; if it's more practical to include 0 in whatever you are doing, you use N~0~. Also, the axiomatization of N is more natural that way IMO.
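For context, the axiomatization referred to here is presumably Peano's, which in Peano's original formulation does start at 1; swapping the distinguished element 1 for 0 gives the other convention, and the shape of the axioms stays the same:

```latex
\begin{aligned}
&\text{(P1)}\quad 1 \in \mathbb{N}\\
&\text{(P2)}\quad n \in \mathbb{N} \;\Rightarrow\; S(n) \in \mathbb{N}\\
&\text{(P3)}\quad \forall n \in \mathbb{N}:\; S(n) \neq 1\\
&\text{(P4)}\quad S(m) = S(n) \;\Rightarrow\; m = n\\
&\text{(P5)}\quad \bigl(1 \in A \;\land\; \forall n\,(n \in A \Rightarrow S(n) \in A)\bigr) \;\Rightarrow\; \mathbb{N} \subseteq A
\end{aligned}
```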

> I've talked too much with mathematicians

You are talking to one right now :) (not sure if a bachelor's degree is enough to call yourself one)

> you can't have 0 apples

You can actually. In fact, right at this moment I have 0 apples. If 0 is not natural, then you have no way of describing the number of apples I have.

There are a lot of concepts (degree of a polynomial, dimension of a space, cardinality of a set, degree of a vertex in a graph) where 0 is a natural possibility.

So I think 1-indexing is fine, I use it all the time, but to me 0 belongs with the naturals. I will say tho that 0 does not make sense to me as an ordinal. "He finished the race in 0-th place"????

[–] sandayle@iusearchlinux.fyi 1 points 1 year ago

What about 10?
