What makes the pairs of operators $(-, +)$ and $(÷, ×)$ so similar?


























Update



My original thoughts are better expressed on this mathoverflow post.



Short Version



When defining the $-$, $+$, $÷$, and $×$ operators in a functional manner, one can observe that the $(-, +)$ pair is very similar to the $(÷, ×)$ pair; the only main differences between them are their identity terms ($0$ and $1$, respectively) and the fact that the divisor cannot be equal to the identity term of the $(-, +)$ pair of operators.



My questions are the following: where can I find some prior work on this topic, and can one define a family of such operator pairs with different identity terms? Is there any theory for such objects?



Set Theory Version



While the arithmetic properties outlined below can be defined for both sets and types, referring to set theory might help clarify the question: if $(+, -)$ with $0$ as identity element defines a group and $[(+, -), (×, ÷)]$ with $1$ as identity element for $(×, ÷)$ defines a field, what is defined by $[(+, -), (×, ÷), (\#, @)]$ with an identity element for $(\#, @)$ other than $0$ and $1$?



Since $(+, -, 0)$ is used to define $\mathbb{Z}$ and $[(+, -, 0), (×, ÷, 1)]$ is used to define $\mathbb{Q}$, which $(\#, @, r)$ could be introduced so that $[(+, -, 0), (×, ÷, 1), (\#, @, r)]$ would define $\mathbb{S}$, with $\mathbb{Q} \subset \mathbb{S} \subseteq \mathbb{R}$?



Intuitively, $\#$ should be based on exponentiation, while $@$ should be based on the logarithm.



Long Version



One can define the $-$, $+$, $÷$, and $×$ operators in the following fashion:



Minus:




$
\small \text{Minus Identity Term: the minus identity term is equal to 0.}\normalsize\\
i(m) = 0.\\
\quad\\
\small \text{Subtraction Identity:} \enspace \alpha - 0 = \alpha.\normalsize\\
m(\alpha, i(m)) = \alpha.\\
\quad\\
\small \text{Self Subtraction:} \enspace \alpha = \beta \Longleftrightarrow \alpha - \beta = 0.\normalsize\\
\alpha = \beta \Longleftrightarrow m(\alpha, \beta) = i(m).\\
\quad\\
\small \text{Subtraction Affine Identity:} \enspace \alpha - (\beta - \gamma) = \gamma - (\beta - \alpha).\normalsize\\
m(\alpha, m(\beta, \gamma)) = m(\gamma, m(\beta, \alpha)).\\
$




Plus:




$
\small \text{Addition Affine Identity:} \enspace (\alpha + \beta) - \gamma = \alpha - (\gamma - \beta).\normalsize\\
m(p(\alpha, \beta), \gamma) = m(\alpha, m(\gamma, \beta)).\\
$




Divides:




$
\small \text{Divides Identity Term: the divides identity term is equal to 1.}\normalsize\\
i(d) = 1.\\
\quad\\
\small \text{Division Identity:} \enspace \frac{\alpha}{1} = \alpha.\normalsize\\
d(\alpha, i(d)) = \alpha.\\
\quad\\
\small \text{Self Division:} \enspace \alpha = \beta \Longleftrightarrow \frac{\alpha}{\beta} = 1.\normalsize\\
\alpha = \beta \Longleftrightarrow d(\alpha, \beta) = i(d).\\
\quad\\
\small \text{Division Affine Identity:} \enspace \frac{\alpha}{\frac{\beta}{\gamma}} = \frac{\gamma}{\frac{\beta}{\alpha}}.\normalsize\\
d(\alpha, d(\beta, \gamma)) = d(\gamma, d(\beta, \alpha)).\\
$




Times:




$
\small \text{Multiplication Affine Identity:} \enspace \frac{\alpha \times \beta}{\gamma} = \frac{\alpha}{\frac{\gamma}{\beta}}.\normalsize\\
d(t(\alpha, \beta), \gamma) = d(\alpha, d(\gamma, \beta)).\\
$




We observe that the pair of divides and times operators is defined in exactly the same way as the pair of minus and plus operators, but with a different identity term, and with a restriction on the divisor subdomain of the divides function: the divisor cannot be equal to $0$, the minus identity term.



The symmetry established between the pairs of operators $(-, +)$ and $(÷, ×)$ allows each of the following properties to be proven for both pairs at once: proving a property for one pair proves it for the other.



The following properties are established for any pair of operator functions $(f, g)$, which stands for either the $(-, +)$ pair or the $(÷, ×)$ pair. Furthermore, the term reverse is used to refer to the opposite for the $(-, +)$ pair and to the inverse for the $(÷, ×)$ pair.



Proofs for the $(-, +)$ pair can be found in this notebook.



Anticommutativity: $f(\alpha, \beta) = f(i(f), f(\beta, \alpha)).$


$
\alpha - \beta = -(\beta - \alpha).\\
\quad\\
\displaystyle \frac{\alpha}{\beta} = \frac{1}{\frac{\beta}{\alpha}}.\\
$




Double Reverse Identity: $\alpha = f(i(f), f(i(f), \alpha)).$


$
\alpha = -(-\alpha).\\
\quad\\
\displaystyle \alpha = \frac{1}{\frac{1}{\alpha}}.\\
$




Associative Commutativity: $f(f(\alpha, \beta), \gamma) = f(f(\alpha, \gamma), \beta).$


$
(\alpha - \beta) - \gamma = (\alpha - \gamma) - \beta.\\
\quad\\
\displaystyle \frac{\frac{\alpha}{\beta}}{\gamma} = \frac{\frac{\alpha}{\gamma}}{\beta}.\\
$




Affine Equivalence: $f(\alpha, \beta) = \gamma \Longleftrightarrow f(\alpha, \gamma) = \beta.$


$
\alpha - \beta = \gamma \Longleftrightarrow \alpha - \gamma = \beta.\\
\quad\\
\displaystyle \frac{\alpha}{\beta} = \gamma \Longleftrightarrow \frac{\alpha}{\gamma} = \beta.\\
$




Identity Element: $g(\alpha, i(f)) = \alpha.$


$
\alpha + 0 = \alpha.\\
\quad\\
\alpha \times 1 = \alpha.\\
$




Dual Substitution: $g(\alpha, \beta) = f(\alpha, f(i(f), \beta)).$


$
\alpha + \beta = \alpha - (-\beta).\\
\quad\\
\displaystyle \alpha \times \beta = \frac{\alpha}{\frac{1}{\beta}}.\\
$




Dual Equivalence: $\alpha = g(\beta, \gamma) \Longleftrightarrow \beta = f(\alpha, \gamma).$


$
\alpha = \beta + \gamma \Longleftrightarrow \beta = \alpha - \gamma.\\
\quad\\
\displaystyle \alpha = \beta \times \gamma \Longleftrightarrow \beta = \frac{\alpha}{\gamma}.\\
$




Commutativity: $g(\alpha, \beta) = g(\beta, \alpha).$


$
\alpha + \beta = \beta + \alpha.\\
\quad\\
\alpha \times \beta = \beta \times \alpha.\\
$




Associativity: $g(g(\alpha, \beta), \gamma) = g(\alpha, g(\beta, \gamma)).$


$
(\alpha + \beta) + \gamma = \alpha + (\beta + \gamma).\\
\quad\\
(\alpha \times \beta) \times \gamma = \alpha \times (\beta \times \gamma).\\
$




Dual Identity: $(g(f(\alpha, \beta), \beta) = \alpha) \land (f(g(\alpha, \beta), \beta) = \alpha).$


$
((\alpha - \beta) + \beta = \alpha) \land ((\alpha + \beta) - \beta = \alpha).\\
\quad\\
\displaystyle \left(\frac{\alpha}{\beta} \times \beta = \alpha\right) \land \left(\frac{\alpha \times \beta}{\beta} = \alpha\right).\\
$
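To make the parallel concrete, here is a minimal Python sketch (an added illustration, not part of the derivation above) that spot-checks the equational properties listed above for both pairs over exact rationals; excluding $0$ from the sample values plays the role of the divisor restriction, and the two biconditional properties are omitted.

```python
from fractions import Fraction
from itertools import product

# f is minus or divides, g is plus or times, e is the identity term.
pairs = [
    ("minus/plus",    lambda a, b: a - b, lambda a, b: a + b, Fraction(0)),
    ("divides/times", lambda a, b: a / b, lambda a, b: a * b, Fraction(1)),
]

# Nonzero rational samples, so division is always defined.
samples = [Fraction(n, d) for n in (-3, -1, 2, 5) for d in (1, 2, 3)]

for name, f, g, e in pairs:
    for a, b, c in product(samples, repeat=3):
        assert f(a, b) == f(e, f(b, a))                    # anticommutativity
        assert a == f(e, f(e, a))                          # double reverse identity
        assert f(f(a, b), c) == f(f(a, c), b)              # associative commutativity
        assert g(a, e) == a                                # identity element
        assert g(a, b) == f(a, f(e, b))                    # dual substitution
        assert g(a, b) == g(b, a)                          # commutativity
        assert g(g(a, b), c) == g(a, g(b, c))              # associativity
        assert g(f(a, b), b) == a and f(g(a, b), b) == a   # dual identity
    print(name, "checks out")
```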































  • 1




    There is a mistake in the anticommutativity part for subtraction
    – Slugger
    Jan 4 at 19:15






  • 7




    Have you heard of groups?
    – jgon
    Jan 4 at 19:16










  • @Slugger Sorry for the mistake and thanks a lot for the fix!
    – ismael
    Jan 4 at 19:19






  • 2




    @ismael No problem! You also say that "the only main difference between them is their identity elements (0 and 1 respectively)", but keep in mind that another significant difference is that multiplicative inverses are not defined for all numbers, i.e., for zero
    – Slugger
    Jan 4 at 19:21








  • 1




    The title might be even better with a minor change to "What makes the pairs of operators $(-, +)$ and $(\div, \times)$ so similar?" so they also look similar
    – Henry
    Jan 4 at 22:48
















7 Answers






































What you're talking about is called a Field.



A Field is a set (say the rational numbers $\mathbb{Q}$, the real numbers $\mathbb{R}$, the complex numbers $\mathbb{C}$, etc.) together with two operations $(+,\times)$ such that the following axioms hold:



The operations are associative: $a + (b + c) = (a + b) + c$ and $a \cdot (b \cdot c) = (a \cdot b) \cdot c$



The operations are commutative: $a+b=b+a$ and $a\cdot b=b\cdot a$



Each of the operations has its own identity element $(0,1)$. Formally, there exist two different elements $0$ and $1$ such that $a + 0 = a$ and $a \cdot 1 = a$.



And each of the operations admits an "inverse" (i.e. we have $(-,/)$). That is,



For every $a$, there exists an element denoted $-a$, such that $a + (-a) = 0$. Similarly, for every $a \not= 0$ there exists an element, often denoted by $a^{-1}$ or $1/a$, such that $a\cdot a^{-1}=1$.



Finally, there is one more axiom which relates the additive and the multiplicative notions. It is called distributivity, and it says that $a \cdot (b + c) = (a \cdot b) + (a \cdot c)$.



We have many fields; some of them are finite and some are infinite. In my opinion the best example of a finite field would be $\mathbb{F}_p$, the field of $p$ elements with addition and multiplication modulo $p$; you can read about it and more finite fields here. The most useful infinite fields (again in my opinion) are the rational numbers, the real numbers and the complex numbers with the usual addition and multiplication. The important part, however, is that every field satisfies all of the properties that you mentioned in your question.
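As a concrete illustration (a sketch added here, not part of the original answer), the axioms above can be checked exhaustively in Python for $\mathbb{F}_7$, that is, arithmetic modulo the prime $7$:

```python
# Finite field F_p for p = 7: addition and multiplication modulo p.
p = 7

def add(a, b): return (a + b) % p
def mul(a, b): return (a * b) % p
def neg(a):    return (-a) % p            # additive inverse
def inv(a):    return pow(a, p - 2, p)    # multiplicative inverse (Fermat), a != 0

for a in range(p):
    assert add(a, 0) == a and mul(a, 1) == a                # identities 0 and 1
    assert add(a, neg(a)) == 0                              # additive inverse
    if a != 0:
        assert mul(a, inv(a)) == 1                          # multiplicative inverse
    for b in range(p):
        assert add(a, b) == add(b, a) and mul(a, b) == mul(b, a)       # commutativity
        for c in range(p):
            assert add(add(a, b), c) == add(a, add(b, c))              # associativity
            assert mul(mul(a, b), c) == mul(a, mul(b, c))
            assert mul(a, add(b, c)) == add(mul(a, b), mul(a, c))      # distributivity
```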



Note that I removed quantifiers from the definitions to make them simpler; for the complete and correct axioms of a field, please click the link in the first line.





























  • Yes indeed, if you apply these properties to set theory, you get a field, but I am trying to study these properties without having to rely on the axioms of set theory. Also, I am looking for a theory that would provide an extensive analysis of families of operators defined with different values for the identity elements. Can values other than $0$ and $1$ be used? And if so, what kind of operators/functions do they introduce? And how do these functions relate to each other (in the same way that $e$ and $\ln$ relate to each other)?
    – ismael
    Jan 4 at 19:38












  • @ismael About your first sentence, I really don't know anything mathematical that doesn't rely on the axioms of set theory. About the second sentence, we just denote the identity elements as $0$ and $1$, I guess you can give them different names but this would be unnecessary. I don't understand your last sentence, what kind of operators or functions do you expect that they will introduce? How are $e$ and $ln$ relevant here?
    – Yanko
    Jan 4 at 19:42










  • If you take a finitist approach and use type theory instead of set theory, you do not need the axioms of set theory. This makes it harder to work with $\mathbb{R}$, but things go really smoothly with $\mathbb{Q}$. As far as identity element (or should I say identity term?) values are concerned, I am calling $0$ the initial object of my underlying coinductive type and $1$ the successor of $0$, with $successor$ being my coinductive operator. Now, I wonder what the operator defined like minus and divides would look like with $-1$ as identity term.
    – ismael
    Jan 4 at 19:46












  • It is not at all clear what you are asking that isn't covered in any basic abstract algebra class. No one can give you an answer if you cannot form an actual question.
    – fleablood
    Jan 4 at 19:50










  • @Yanko Using the terminology of set theory for clarity's sake, if you have a group with (-, +) and a field with [(-, +), (/, ×)], what do you have when you add a new pair of operators (@, #) with @ defined using an element other than 0 and 1 as identity element (say -1 for example)? Is there a name for such an algebraic structure?
    – ismael
    Jan 4 at 19:50



































Since you're interested in type theory and say that you therefore want an element free perspective, I'll give you the categorical perspective.



In category theory, we can define group objects in a category $C$ with finite products (including the terminal object, $*$) as an object $G$ with $\mu : G\times G \to G$ (a binary operator), $e : * \to G$ (a nullary operator), and $i : G\to G$ (a unary operator) satisfying the following relations, where $\Delta_G : G\to G\times G$ is the diagonal map and $\tau_G : G\to *$ is the map to the terminal object:



Associativity:
$$\mu\circ (\mu\times \operatorname{id}) = \mu\circ (\operatorname{id}\times \mu) : G\times G\times G \to G$$
Identity:
$$\mu\circ (\operatorname{id}\times e)=\mu\circ (e \times \operatorname{id})=\operatorname{id} : G\to G$$
Inverses:
$$\mu\circ (\operatorname{id}\times i) \circ \Delta_G = \mu\circ (i\times \operatorname{id}) \circ \Delta_G = e\circ \tau_G : G\to G$$



Now this axiomatization is equivalent to the axiomatization you've given in your question, except that instead of inversion, you've given division as the primitive operation.



To get your data, we define division as $d=\mu \circ (\operatorname{id} \times i)$.



Conversely, given division $d: G\times G\to G$, we define $i$ by $i=d\circ (e\times \operatorname{id})$.



Your axiomatization gives associativity and identity for free, plus also commutativity (so you're technically axiomatizing abelian groups).



Then your "dual identity" can be phrased $$mucirc (dtimes id) circ (id times Delta_G) = dcirc (mu times id)circ (id times Delta_G) = id times tau_G : Gtimes Gto G $$



Composing with $e\times \operatorname{id}$ we get the identity
$$\mu\circ (d\times \operatorname{id}) \circ (\operatorname{id}\times \Delta_G) \circ (e\times \operatorname{id}) =
\mu\circ (d\times \operatorname{id})\circ (e\times \operatorname{id}\times \operatorname{id})\circ \Delta_G = \mu\circ (i\times \operatorname{id})\circ \Delta_G = e\circ \tau_G,$$

which is half of the inverses identity, and the other half we get is:
$$d\circ (\mu\times \operatorname{id}) \circ (\operatorname{id}\times \Delta_G) \circ (e\times \operatorname{id}) =
d\circ (\mu\times \operatorname{id})\circ (e\times \operatorname{id}\times \operatorname{id})\circ \Delta_G = d\circ \Delta_G = e\circ \tau_G,$$

so we just need to check $d = \mu\circ (\operatorname{id} \times i)$, and this follows from your
double reverse and dual substitution identities.
(We get $\alpha + (-\beta) = \alpha - (-(-\beta)) = \alpha - \beta$.)



Conclusion



All of the properties you've listed follow from the fact that the operations you've chosen define abelian groups.



Thus the reason the triples of operators (don't forget the identity) are so similar is that they each define abelian groups.
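To see this translation concretely, here is a short Python sketch (an added illustration, with $\Bbb{Z}/12$ under addition chosen arbitrarily as the abelian group) that defines $d$ from $(\mu, e, i)$, recovers $i$ from $d$, and checks the dual identities:

```python
# Translate between the group data (mu, e, i) and the division-style operation d
# on Z/12 under addition, then verify the dual identities.
n = 12
G = range(n)

def mu(a, b): return (a + b) % n    # group operation
def i(a):     return (-a) % n       # inversion
e = 0                               # identity

def d(a, b):  return mu(a, i(b))    # d = mu o (id x i)
def i2(a):    return d(e, a)        # i recovered as d o (e x id)

for a in G:
    assert i2(a) == i(a)                    # both definitions of inversion agree
    for b in G:
        assert mu(d(a, b), b) == a          # (a - b) + b = a
        assert d(mu(a, b), b) == a          # (a + b) - b = a
```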



Edit:



It's now a bit more clear to me what you're asking about. You also are interested in the relationship between these pairs/triples of operators, and how to possibly add another pair/triple.



In which case I feel the need to point out that fields don't come with two pairs of operations.



It's actually a bit easier to see this in the case of (commutative) rings.



For a general commutative ring $R$, define $a/b = a\cdot b^{-1}$ when $b$ is invertible.



Then the collection of all invertible elements of $R$, denoted $R^\times$, forms a group, and it has identity $1$, the usual multiplication as multiplication, and the division just defined gives the division operation.



Now $R^\times=R$ as sets only when $R=0$, the zero ring, since otherwise $0$ is never invertible. Thus the triple of operations $(1,*,/)$ is never actually a triple of operations on $R$, but rather a triple of operations on the related object $R^\times$.



In the very special case of fields, $R^\times = R\setminus\{0\}$, but for, say, the integers, we have $\Bbb{Z}^\times = \{1,-1\}$.



Also there is an additional axiom relating the operations $+$ and $*$, the distributive law.



Thus it's not clear what you mean by adding another triple of operations.



The two triples of operations already discussed aren't defined on the same set/type to begin with, so it's not quite clear how you'd be adding a third.



Also even if you did construct a related type on which to define a third operation, this third operation should relate to the previous two in some way.



In mathematics, there are examples of rings with additional operations (though none that I can think of that form an abelian group), such as differential graded algebras, but the third operation always relates to the prior two in some way.





























  • Thank you so much for this explanation! I really need to dig into category theory (and probably homotopy type theory as well while I am at it). Also, sorry for the confusion regarding elements and terms. I have fixed it in the original post.
    – ismael
    Jan 4 at 20:24










  • @ismael I've edited in an additional section in response to your edits
    – jgon
    Jan 4 at 20:51










  • This is super helpful! Yes, the third operation should be defined in relation to the previous two, much like the second is defined in relation to the first. And I intentionally avoided talking about distributivity because I am still working on it, trying to figure out whether this has to be an axiom or whether it can be proven.
    – ismael
    Jan 4 at 21:16










  • Another way to reformulate my original question is this: since $(+, -, 0)$ is used to define $\mathbb{Z}$ and $[(+, -, 0), (×, /, 1)]$ is used to define $\mathbb{Q}$, which triplet $(\#, @, r)$ could be introduced so that $[(+, -, 0), (×, /, 1), (\#, @, r)]$ would define $\mathbb{S}$, with $\mathbb{Q} \subset \mathbb{S} \subseteq \mathbb{R}$?
    – ismael
    Jan 4 at 22:04






  • 1




    @ismael Part of my point though is that division isn't an operation on $\Bbb{Q}$, since division is only defined on $\Bbb{Q}\times\Bbb{Q}^\times$. Also it's really too broad a question to ask what additional operation could be introduced without specifying how it interacts with the other two operations.
    – jgon
    Jan 4 at 22:21

































Update: A more detailed answer is available on this mathoverflow post.



Following @Henry’s suggestion, a recursive structure of abelian groups can be constructed by using commutative hyperoperations:



$p_{n+1}(a, b) = \exp(p_n(\ln(a), \ln(b)))$


$p_0(a, b) = \ln\left(e^a + e^b\right)$


$p_1(a, b) = a + b$


$p_2(a, b) = a\cdot b = e^{\ln(a) + \ln(b)}$


$p_3(a, b) = a^{\ln(b)} = e^{\ln(a)\ln(b)}$


$p_4(a, b) = e^{e^{\ln(\ln(a))\ln(\ln(b))}}$



These functions give us the sequence of $(+, ×, ...)$ operations, while their inverse functions give us the sequence of $(-, ÷, ...)$ dual operations. The sequence of identity terms is $(0, 1, e, ...)$. With this, $T_1$ (Type Level 1) is isomorphic to a group, $T_2$ is isomorphic to a field, and successive types give you more and more complex objects.



The identity terms are:



$i_n = e \upuparrows (n - 2).$


$i_1 = 0.$


$i_2 = 1.$


$i_3 = e.$


$i_4 = e^e.$


$i_5 = e^{e^e}.$
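As a quick numerical sanity check (an added illustration, not part of the original answer; the sample values are kept above $e$ because the recursion keeps taking logarithms, so smaller values would leave the real domain):

```python
import math

# Commutative hyperoperations p_1..p_4 and their identity terms 0, 1, e, e^e.
def p(n, a, b):
    if n == 1:
        return a + b
    return math.exp(p(n - 1, math.log(a), math.log(b)))

identities = {1: 0.0, 2: 1.0, 3: math.e, 4: math.e ** math.e}

for n, i_n in identities.items():
    for a in (3.0, 5.5, 20.0):
        assert math.isclose(p(n, a, i_n), a, rel_tol=1e-9)             # i_n acts as identity
        assert math.isclose(p(n, a, 4.0), p(n, 4.0, a), rel_tol=1e-9)  # commutativity
```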



While I cannot yet fathom what $T_4$ and successive types can be used for, I have to believe that $T_3$ is interesting, because it brings exponentiation to the table in a very natural manner. Therefore, stopping at the level of fields feels quite shortsighted.



Also, $T_1$ is isomorphic to $\mathbb{Z}$ and $T_2$ is isomorphic to $\mathbb{Q}$, but $T_3$ is isomorphic to a strict subset of $\mathbb{R}$. This suggests that the gap between $\mathbb{Q}$ and $\mathbb{R}$ is pretty large and should be filled incrementally with larger and larger sets. One interesting question is whether $T_n$ "converges" toward a structure that is isomorphic to $\mathbb{R}$ as $n$ increases.





























  • The identity for $F_0$ seems to be $-\infty$, so there are no inverses, while for $F_4$ it is $e^e$ and for $F_n$ it seems to be $e\uparrow\uparrow(n-2)$
    – Henry
    2 days ago










  • That looks right to me. Now I need to write the inverse functions.
    – ismael
    2 days ago










  • @Henry I gave you credit on this follow-up mathoverflow post. I hope this is OK with you. If not, I am happy to remove it.
    – ismael
    2 days ago

































Here is my best attempt at answering this question; but the answer I have is likely to be disappointing.




If $(+, -)$ with 0 as identity element defines a group and $[(+, -), (×, /)]$ with 1 as identity element for $(×, /)$ defines a field, what is defined by $[(+, -), (×, /), (\#, @)]$ with an identity element for $(\#, @)$ other than 0 and 1?




As far as I know, no such thing has been studied any significant amount.



As you've noticed, any field has two corresponding groups: its additive group and its multiplicative group. These two groups have different identity elements.



I'm not aware of any kind of algebraic structure which has three corresponding groups. And nobody is going to study such things, or name them, until someone has found an interesting example of such a thing.























  • 1




    That’s a totally valid answer. Indeed, I do not like it, but it’s quite possibly the best answer so far.
    – ismael
    Jan 4 at 21:34










  • I guess @Henry just found one: $a \,\#\, b = a^{\log(b)} = b^{\log(a)}$ and $a \,@\, b = a^{\frac{1}{\log(b)}}$ with $e$ as identity term. Having this provide a third group built on top of the previous two is a more elegant way of stating that exponentiation is repeated multiplication. And it makes me wonder what comes next after 0, 1, and e...
    – ismael
    Jan 5 at 2:24



































$\log$ turns multiplication and division into addition and subtraction. The precise statement is that $\log: \mathbb R^+ \to \mathbb R$ is a group isomorphism, whose inverse is $\exp$.
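For instance, the following small Python check (an added illustration, on a few sample values) confirms the isomorphism numerically:

```python
import math

# log carries (R_{>0}, x, /) to (R, +, -), and exp carries it back.
for a in (0.5, 2.0, 7.3):
    for b in (0.25, 3.0, 10.0):
        assert math.isclose(math.log(a * b), math.log(a) + math.log(b), abs_tol=1e-12)
        assert math.isclose(math.log(a / b), math.log(a) - math.log(b), abs_tol=1e-12)
        assert math.isclose(math.exp(math.log(a) + math.log(b)), a * b, rel_tol=1e-9)
```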























  • 1




    Excellent point, but this is more a corollary than a justification. And the definition of such functions for the pair of identity elements $(0, 1)$ makes me wonder what functions could be defined with other pairs or additional operators defined with other identity elements like $-1$, $e$, or $\pi$.
    – ismael
    Jan 4 at 19:28








  • 1




    The extension could then be that $a \,\#\, b = a^{\log(b)} = b^{\log(a)}$ on $\mathbb R_{>1}$, so $a \,@\, b = a^{1/\log(b)}$ with an identity which is $e$ for natural logarithms, and then $a \,\#\, \exp(1/\log(a)) = e$ showing the inverse
    – Henry
    Jan 4 at 23:08










  • @Henry This is it! Beautiful! Thank you so very much.
    – ismael
    Jan 5 at 2:10










  • @Henry Could you make this an answer to the question so that it can be closed?
    – ismael
    Jan 5 at 2:10










  • @Henry With this result, we now have a very “natural” series 0, 1, e. This begs the following question: what comes next? I mean, this is too beautiful to be a simple coincidence...
    – ismael
    Jan 5 at 2:29

































How do you define these operations? If it's the primary-school way on real numbers, it follows from the facts that (i) reals form an Abelian group under $+$, its identity element famously named $0$, and (ii) reals $\ne 0$ form an Abelian group under $\times$. (Note this guarantees many similarities follow from group theory.) This, together with $a\times (b+c)=a\times b+a\times c$ (we say $\times$ distributes over $+$), defines a field. Maths has a lot of groups, and a lot of fields; and where you have fields, you have two very similar operations.



























  • I define them on top of a coinductive type as outlined in this notebook.
    – ismael
    Jan 4 at 19:34

































I think what's going on is this:



Suppose you have any binary relation $\star$ making $X$ an abelian group. One way to express the relation is that it is a subset $S\subseteq (X\times X)\times X$ where $a\star b=c$ iff $((a,b),c)\in S$.



You can immediately form a new relation $S'=\{((c,a),b)\mid (a,b,c)\in S\}$, and that describes a different binary operation. The fact that $S$ was formed from an abelian group operation allows you to say that this actually is a function.



And you can repeat this again to get $S''=\{((b,c),a)\mid (a,b,c)\in S\}$, but it isn't as obvious that it is a function from its origin in $S'$; we can, however, appeal to $S$ again to prove it is a function.



Repeating the trick a third time gets you back to $S$.



If you take the special case where $\star$ is addition, you'll find that $S'$ is subtraction where the thing subtracted is on the right, and $S''$ is like subtraction where the thing subtracted is on the left.



All this means, I think, is that the binary operations for some groups that we are all very familiar with can be translated to this new funky ordering, and because of the group properties contained in $S$, you will have a standard set of properties available in $S'$ (and also perhaps a slightly different set for $S''$, I didn't check).



My gut feeling is that the set of group axioms on $S'$ is equivalent in some sense to the abelian group axioms encompassed in $S$, so that we really haven't learned anything new: we've just rewritten all the addition in terms of subtraction, and all the division in terms of multiplication. It does not feel like there is anything significant in this process.
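For concreteness, here is a small Python sketch (an added illustration, using addition on $\Bbb{Z}/5$) that builds $S$, $S'$, and $S''$ and checks that each one is the graph of a binary operation, with $S'$ and $S''$ being the two flavours of subtraction described above:

```python
# Build S, S', S'' for addition on Z/5 and check each is the graph of a
# binary operation.
n = 5
G = range(n)

S  = {((a, b), (a + b) % n) for a in G for b in G}      # a + b = c
S1 = {((c, a), b) for ((a, b), c) in S}                 # S'
S2 = {((b, c), a) for ((a, b), c) in S}                 # S''

def is_graph_of_function(R):
    inputs = [x for (x, _) in R]
    return len(R) == n * n and len(set(inputs)) == n * n   # every input pair exactly once

assert is_graph_of_function(S) and is_graph_of_function(S1) and is_graph_of_function(S2)

# S' is subtraction with the subtracted element on the right: (c, a) -> c - a
assert all(out == (c - a) % n for ((c, a), out) in S1)
# S'' is subtraction with the subtracted element on the left: (b, c) -> c - b
assert all(out == (c - b) % n for ((b, c), out) in S2)
```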



























  • Well, I am not so sure. First, I am not using set theory, I am using coinductive type theory, therefore some axioms of set theory are not necessary. Second, by defining subtraction before addition, I can deal with measures like temperatures that do not support addition (this is a very big deal for physicists and statisticians). Third, my real question is related to the values picked for the identity terms: what happens when these values are not $(0, 1)$? Or what happens when you add a third pair of operators with a third identity term (say -1)? Has anyone worked on this yet?
    – ismael
    Jan 4 at 19:58








  • 1




    I don't understand your first and third points. I think I understand your second point, and I'm suggesting that even though that is fine, it probably amounts to the same thing as addition in the end.
    – rschwieb
    Jan 4 at 20:00












  • I don’t think it does. While you can add a temperature delta to a temperature, you cannot add two temperatures. This suggests that the addition operator should not have the cartesian product of the same set as domain, but the cartesian product of a type that does not support addition with a type that does support addition (I’m not sure that talking about cartesian product for types makes perfect sense, but this is a totally different subject). This kind of hybrid domain is not allowed by groups or fields unfortunately...
    – ismael
    Jan 4 at 20:02












  • @ismael I'm not sure I agree with your specific example, that you can't add temperatures, but it sounds like you're talking about the idea of an affine space. Also usually the Cartesian product of types is the type of pairs. E.g., in Haskell notation for algebraic data types, the cartesian product of a and b is Pair a b, where data Pair a b = Pair a b. (Although in Haskell, you would usually just use the built-in type (a,b)).
    – jgon
    Jan 4 at 20:57












  • @jgon You’re totally on point: I am trying to define my types in a more “affine” manner, hence the names I gave to some of the properties of their operators. Also, it seems to me that by using such “affine” expressions for the axiomatic definition of the types, you end up with fewer axioms than if you don’t, which I view as a plus. For example, in the case of the multiplication, it is defined with a single axiom, from which we can prove the identity term, associativity, and commutativity. These “affine” expressions are really quite powerful in that respect.
    – ismael
    Jan 4 at 21:18













Your Answer





StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");

StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);

StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});

function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});


}
});














draft saved

draft discarded


















StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3061985%2fwhat-makes-the-pairs-of-operators-and-%25c3%25b7-%25c3%2597-so-similar%23new-answer', 'question_page');
}
);

Post as a guest















Required, but never shown

























7 Answers
7






active

oldest

votes








7 Answers
7






active

oldest

votes









active

oldest

votes






active

oldest

votes









11














What you're talking about is called a Field.



A Field is a set (say rational number $mathbb{Q}$, real numbers $mathbb{R}$, complex numbers $mathbb{C}$, etc...) together with two operations $(+,times)$ such that the following axioms holds:



The operations are associative: $a + (b + c) = (a + b) + c$ and $a cdot (b cdot c) = (a cdot b) $



The operations are commutative: $a+b=b+a$ and $acdot b=bcdot a$



Each of the operations has it's own identity element $(0,1)$. Formally, there exist two different elements $0$ and $1$ such that $a + 0 = a$ and $a · 1 = a$.



And each of the operations admits an "inverse" (i.e. we have $(-,/)$). That is,



For every $a$, there exists an element denoted $−a$, such that $a + (−a) = 0$. Similarly for every $anot = 0$ there exists an element, often denoted by $a^{-1}$ or $1/a$ such that $acdot a^{-1}=1$.



Finally there is one more axiom which associate between the additive and multiplicative notion. It's called the distributivity and it says that $a cdot (b + c) = (a cdot b) + (a cdot c)$.



We have many fields some of them are finite and some are infinite. In my opinion the best example for a finite field would be $mathbb{F}_p$ - the field of $p$-elements with addition and multiplication modulo $p$, you can read about it and more finite fields here. The most useful infinite fields (Again in my opinion) are the rational numbers, the real numbers and the complex numbers with the usual addition and multiplication. The important part however is that and every field satisfies all of the properties that you mentioned in your question.



Note that I removed quantifiers from the definitions to make them simpler, for the complete and correct axioms of a field please click the link in the first line.






share|cite|improve this answer























  • Yes indeed, if you apply these properties to set theory, you get a field, but I am trying to study these properties without having to rely on the axioms of set theory. Also, I am looking for a theory that would provide an extensive analysis of families of operators defined with different values for the identity elements. Can values other than $0$ and $1$ be used? And if so, what kind of operators/functions do they introduce? And how do these functions relate to each other (in the same way that $e$ and $ln$ relate to each other)?
    – ismael
    Jan 4 at 19:38












  • @ismael About your first sentence, I really don't know anything mathematical that doesn't rely on the axioms of set theory. About the second sentence, we just denote the identity elements as $0$ and $1$, I guess you can give them different names but this would be unnecessary. I don't understand your last sentence, what kind of operators or functions do you expect that they will introduce? How are $e$ and $ln$ relevant here?
    – Yanko
    Jan 4 at 19:42










  • If you take a finitist approach and use type theory instead of set theory, you do not need the axioms of set theory. This makes it harder to work with $mathbb{R}$, but things go really smoothly with $mathbb{Q}$. As far as identity element (or should I say identity term?) values are concerned, I am calling $0$ the initial object of my underlying coinductive type and $1$ the successor of $0$, with $successor$ being my coinductive operator. Now, I wonder what the operator defined like minus and divides would look like with $-1$ as identity term.
    – ismael
    Jan 4 at 19:46












  • It is not at all clear what you are asking that isnt covered in any basic abstract algebra class. No one can give you an answer if you can not form an actual question.
    – fleablood
    Jan 4 at 19:50










  • @Yanko Using the terminology of set theory for clarity sake, if you have a group with (-, +) and a field with [(-, +), (/, ×)], what do you have when you add a new pair of operators (@, #) with @ defined using an element other than 0 and 1 as identity element (say -1 for example)? Is there a name for such an algebraic structure?
    – ismael
    Jan 4 at 19:50


















11














What you're talking about is called a Field.



A Field is a set (say rational number $mathbb{Q}$, real numbers $mathbb{R}$, complex numbers $mathbb{C}$, etc...) together with two operations $(+,times)$ such that the following axioms holds:



The operations are associative: $a + (b + c) = (a + b) + c$ and $a cdot (b cdot c) = (a cdot b) $



The operations are commutative: $a+b=b+a$ and $acdot b=bcdot a$



Each of the operations has it's own identity element $(0,1)$. Formally, there exist two different elements $0$ and $1$ such that $a + 0 = a$ and $a · 1 = a$.



And each of the operations admits an "inverse" (i.e. we have $(-,/)$). That is,



For every $a$, there exists an element denoted $−a$, such that $a + (−a) = 0$. Similarly for every $anot = 0$ there exists an element, often denoted by $a^{-1}$ or $1/a$ such that $acdot a^{-1}=1$.



Finally there is one more axiom which associate between the additive and multiplicative notion. It's called the distributivity and it says that $a cdot (b + c) = (a cdot b) + (a cdot c)$.



We have many fields some of them are finite and some are infinite. In my opinion the best example for a finite field would be $mathbb{F}_p$ - the field of $p$-elements with addition and multiplication modulo $p$, you can read about it and more finite fields here. The most useful infinite fields (Again in my opinion) are the rational numbers, the real numbers and the complex numbers with the usual addition and multiplication. The important part however is that and every field satisfies all of the properties that you mentioned in your question.



Note that I removed quantifiers from the definitions to make them simpler, for the complete and correct axioms of a field please click the link in the first line.






share|cite|improve this answer























  • Yes indeed, if you apply these properties to set theory, you get a field, but I am trying to study these properties without having to rely on the axioms of set theory. Also, I am looking for a theory that would provide an extensive analysis of families of operators defined with different values for the identity elements. Can values other than $0$ and $1$ be used? And if so, what kind of operators/functions do they introduce? And how do these functions relate to each other (in the same way that $e$ and $ln$ relate to each other)?
    – ismael
    Jan 4 at 19:38












  • @ismael About your first sentence, I really don't know anything mathematical that doesn't rely on the axioms of set theory. About the second sentence, we just denote the identity elements as $0$ and $1$, I guess you can give them different names but this would be unnecessary. I don't understand your last sentence, what kind of operators or functions do you expect that they will introduce? How are $e$ and $ln$ relevant here?
    – Yanko
    Jan 4 at 19:42










  • If you take a finitist approach and use type theory instead of set theory, you do not need the axioms of set theory. This makes it harder to work with $mathbb{R}$, but things go really smoothly with $mathbb{Q}$. As far as identity element (or should I say identity term?) values are concerned, I am calling $0$ the initial object of my underlying coinductive type and $1$ the successor of $0$, with $successor$ being my coinductive operator. Now, I wonder what the operator defined like minus and divides would look like with $-1$ as identity term.
    – ismael
    Jan 4 at 19:46












  • It is not at all clear what you are asking that isn't covered in any basic abstract algebra class. No one can give you an answer if you cannot form an actual question.
    – fleablood
    Jan 4 at 19:50










  • @Yanko Using the terminology of set theory for clarity's sake: if you have a group with (-, +) and a field with [(-, +), (/, ×)], what do you have when you add a new pair of operators (@, #) with @ defined using an element other than 0 and 1 as the identity element (say -1, for example)? Is there a name for such an algebraic structure?
    – ismael
    Jan 4 at 19:50
















7














Since you're interested in type theory and say that you therefore want an element free perspective, I'll give you the categorical perspective.



In category theory, we can define group objects in a category $C$ with finite products (including the terminal object, $*$) as an object $G$ with $mu : Gtimes G to G$ (a binary operator), $e: * to G$ (a nullary operator), and $i : Gto G$ (a unary operator) satisfying the following relations, where $Delta_G : Gto Gtimes G$ is the diagonal map and $tau_G : Gto *$ is the map to the terminal object:



Associativity:
$$mucirc (mutimes newcommandid{operatorname{id}}id) = mucirc (idtimes mu) :Gtimes Gtimes G to G$$
Identity:
$$mucirc (idtimes e)=mucirc (e times id)=id : Gto G$$
Inverses:
$$mucirc (idtimes i) circ Delta_G = mucirc (itimes id) circ Delta_G = ecirc tau_G : Gto G$$



Now this axiomatization is equivalent to the axiomatization you've given in your question, except that instead of inversion, you've given division as the primitive operation.



To get your data, we define division as $d=mu circ (id times i)$.



Conversely, given division $d: Gtimes Gto G$, we define $i$ by $i=dcirc (etimes id)$.
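As a concrete sanity check of these two translations (a sketch of mine, not from the answer), here they are written out in Python for the additive group of integers, with tuples standing in for the categorical product and the names mu, e, i, d matching the maps above:

    # Group-object data for (Z, +): mu = addition, e = 0, i = negation.
    def mu(pair):
        a, b = pair
        return a + b          # binary operation  G x G -> G

    def e(_=None):
        return 0              # nullary operation * -> G (identity element)

    def i(a):
        return -a             # unary operation   G -> G (inversion)

    # Division defined from inversion:  d = mu o (id x i)
    def d(pair):
        a, b = pair
        return mu((a, i(b)))

    # Inversion recovered from division:  i = d o (e x id)
    def i_from_d(a):
        return d((e(), a))

    assert all(d((a, b)) == a - b for a in range(-3, 4) for b in range(-3, 4))
    assert all(i_from_d(a) == i(a) for a in range(-3, 4))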



Your axiomatization gives associativity and identity for free, plus also commutativity (so you're technically axiomatizing abelian groups).



Then your "dual identity" can be phrased $$mucirc (dtimes id) circ (id times Delta_G) = dcirc (mu times id)circ (id times Delta_G) = id times tau_G : Gtimes Gto G $$



Composing with $etimes id$ we get the identity
$$mucirc (dtimes id) circ (idtimes Delta_G) circ (etimes id) =
mucirc (dtimes id)circ (etimes idtimes id)circ Delta_G = mucirc (itimes id)circ Delta_G=ecirc tau_G,$$

which is half of the inverses identity, and the other half we get is:
$$dcirc (mutimes id) circ (idtimes Delta_G) circ (etimes id) =
dcirc (mutimes id)circ (etimes idtimes id)circ Delta_G = dcirc Delta_G=ecirc tau_G,$$

so we just need to check $d = mucirc (id times i)$, and this follows from your
double reverse and dual substitution identities.
(We get $alpha + (-beta) = alpha - (-(-beta)) = alpha - beta$).



Conclusion



All of the properties you've listed follow from the fact that the operations you've chosen define abelian groups.



Thus the reason the triples of operators (don't forget the identity) are so similar is that they each define abelian groups.



Edit:



It's now a bit clearer to me what you're asking about. You are also interested in the relationship between these pairs/triples of operators, and how one might add another pair/triple.



In which case I feel the need to point out that fields don't come with two pairs of operations.



It's actually a bit easier to see this in the case of (commutative) rings.



For a general commutative ring $R$ define $a/b = acdot b^{-1}$ when $b$ is invertible.



Then the collection of all invertible elements of $R$, denoted $R^times$, forms a group: it has identity $1$, the usual multiplication as its multiplication, and the division just defined as its division operation.



Now $R^times=R$ as sets only when $R=0$, the zero ring, since otherwise $0$ is never invertible. Thus the triple of operations $(1,*,/)$ is never actually a triple of operations on $R$, but rather a triple of operations on the related object $R^times$.



In the very special case of fields, $R^times = Rsetminus{0}$, but for say the integers, we have $Bbb{Z}^times = {1,-1}$.
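A tiny brute-force illustration of that last claim (my own example, using an arbitrary finite search window): the only integers with an integer multiplicative inverse are $pm 1$.

    # b is a unit of Z iff some integer a satisfies a * b == 1;
    # for |b| > 1 the inverse would be fractional, so a bounded window suffices.
    window = range(-100, 101)
    units_of_Z = [b for b in window if any(a * b == 1 for a in window)]
    assert sorted(units_of_Z) == [-1, 1]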



Also there is an additional axiom relating the operations $+$ and $*$, the distributive law.



Thus it's not clear what you mean by adding another triple of operations.



The two triples of operations already discussed aren't defined on the same set/type to begin with, so it's not quite clear how you'd be adding a third.



Also even if you did construct a related type on which to define a third operation, this third operation should relate to the previous two in some way.



In mathematics, there are examples of rings with additional operations (though none that I can think of that form an abelian group), such as differential graded algebras, but the third operation always relates to the prior two in some way.






answered Jan 4 at 20:11, edited 2 days ago – jgon
  • Thank you so much for this explanation! I really need to dig into category theory (and probably homotopy type theory as well while I am at it). Also, sorry for the confusion regarding elements and terms. I have fixed it in the original post.
    – ismael
    Jan 4 at 20:24










  • @ismael I've edited in an additional section in response to your edits
    – jgon
    Jan 4 at 20:51










  • This is super helpful! Yes, the third operation should be defined in relation to the previous two, much like the second is defined in relation to the first. And I intentionally avoided talking about distributivity because I am still working on it, trying to figure out whether this has to be an axiom or whether it can be proven.
    – ismael
    Jan 4 at 21:16










  • Another way to reformulate my original question is this: since $(+, -, 0)$ is used to define $mathbb{Z}$ and $[(+, -, 0), (×, /, 1)]$ is used to define $mathbb{Q}$, which triplet $(#, @, r)$ could be introduced so that $[(+, -, 0), (×, /, 1), (#, @, r)]$ would define $mathbb{S}$, with $mathbb{Q} subset mathbb{S} subseteq mathbb{R}$?
    – ismael
    Jan 4 at 22:04






  • @ismael Part of my point though is that division isn't an operation on $Bbb{Q}$, since division is only defined on $Bbb{Q}timesBbb{Q}^times$. Also it's really too broad a question to ask what additional operation could be introduced without specifying how it interacts with the other two operations.
    – jgon
    Jan 4 at 22:21
















3














Update: A more detailed answer is available on this mathoverflow post.



Following @Henry’s suggestion, a recursive structure of abelian groups can be constructed by using commutative hyperoperations:



$p_{n+1}(a, b) = exp(p_n(ln(a), ln(b)))$



$p_0(a, b) = lnleft(e^a + e^bright)$



$p_1(a, b) = a + b$



$p_2(a, b) = acdot b = e^{ln(a) + ln(b)}$



$p_3(a, b) = a^{ln(b)} = e^{ln(a)ln(b)}$



$p_4(a, b) = e^{e^{ln(ln(a))ln(ln(b))}}$



These functions give us the sequence of $(+, ×, ...)$ operations, while their inverse functions give us the sequence of $(-, ÷, ...)$ dual operations. The sequence of identity terms is $(0, 1, e, ...)$. With this, $T_1$ (Type Level 1) is isomorphic to a group, $T_2$ is isomorphic to a field, and successive types give you more and more complex objects.



The identity terms are:



$i_n = e upuparrows (n - 2).$



$i_1 = 0.$



$i_2 = 1.$



$i_3 = e.$



$i_4 = e ^ e.$



$i_5 = e ^ {e ^ e}.$



While I cannot yet fathom what $T_4$ and successive types can be used for, I have to believe that $T_3$ is interesting, because it brings exponentiation to the table in a very natural manner. Therefore, stopping at the level of fields feels quite shortsighted.



Also, $T_1$ is isomorphic to $mathbb{Z}$ and $T_2$ is isomorphic to $mathbb{Q}$, but $T_3$ is isomorphic to a strict subset of $mathbb{R}$. This suggests that the gap between $mathbb{Q}$ and $mathbb{R}$ is pretty large and should be filled incrementally with larger and larger sets. One interesting question is whether $T_n$ “converges” toward a structure that is isomorphic to $mathbb{R}$ as $n$ increases.
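Since the recursion only involves $exp$ and $ln$, it is easy to experiment with numerically. The following is a minimal Python sketch (the function names p and identity are chosen here, not part of the post) that implements $p_n$ and the identity terms $i_n$, and spot-checks that $p_1$ is addition, $p_2$ is multiplication, $p_3(a, b) = a^{ln(b)}$, and that each $i_n$ acts as an identity on positive inputs in the appropriate domain:

    import math

    def p(n, a, b):
        # Commutative hyperoperation p_n: p_1 is ordinary addition; each higher
        # level conjugates the previous one by exp/ln, and p_0 conjugates the
        # other way round.
        if n == 1:
            return a + b
        if n > 1:
            return math.exp(p(n - 1, math.log(a), math.log(b)))
        return math.log(math.exp(a) + math.exp(b))   # n == 0

    def identity(n):
        # i_n = e tetrated (n - 2) times: i_1 = 0, i_2 = 1, i_3 = e, i_4 = e^e, ...
        x = 0.0
        for _ in range(n - 1):
            x = math.exp(x)
        return x

    assert abs(p(1, 2.0, 3.0) - 5.0) < 1e-9                    # addition
    assert abs(p(2, 2.0, 3.0) - 6.0) < 1e-9                    # multiplication
    assert abs(p(3, 2.0, 3.0) - 2.0 ** math.log(3.0)) < 1e-9   # a^ln(b)
    for n in (1, 2, 3, 4):
        assert abs(p(n, 7.0, identity(n)) - 7.0) < 1e-6        # i_n acts as identity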






answered Jan 5 at 3:06, edited 2 days ago – ismael
  • The identity for $F_0$ seems to be $-infty$, so there are no inverses, while for $F_4$ it is $e^e$ and for $F_n$ it seems to be $euparrowuparrow(n-2)$
    – Henry
    2 days ago










  • That looks right to me. Now I need to write the inverse functions.
    – ismael
    2 days ago










  • @Henry I gave you credit on this follow-up mathoverflow post. I hope this is OK with you. If not, I am happy to remove it.
    – ismael
    2 days ago
















2














Here is my best attempt at answering this question, but the answer I have is likely to be disappointing.




If $(+, -)$ with 0 as identity element defines a group and $[(+, -), (×, /)]$ with 1 as identity element for $(×, /)$ defines a field, what is defined by $[(+, -), (×, /), (#, @)]$ with an identity element for $(#, @)$ other than 0 and 1?




As far as I know, no such thing has been studied any significant amount.



As you've noticed, any field has two corresponding groups: its additive group and its multiplicative group. These two groups have different identity elements.



I'm not aware of any kind of algebraic structure which has three corresponding groups. And nobody is going to study such things, or name them, until someone has found an interesting example of such a thing.






answered Jan 4 at 21:32 – Tanner Swett
  • That’s a totally valid answer. Indeed, I do not like it, but it’s quite possibly the best answer so far.
    – ismael
    Jan 4 at 21:34










  • I guess @Henry just found one: $a # b = a ^{log(b)} = b ^{log(a)}$ and $a @ b = a ^ frac{1}{log(b)}$ with $e$ as identity term. Having this provide a third group built on top of the previous two is a more elegant way of stating that exponentiation is repeated multiplication. And it makes me wonder what comes next after 0, 1, and e...
    – ismael
    Jan 5 at 2:24


















1














$log$ turns multiplication and division into addition and subtraction. The precise statement is that $log: mathbb R^+ to mathbb R$ is a group isomorphism, whose inverse is $exp$.
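A quick numerical spot-check of this (an added illustration, not part of the answer), together with the $#$ operation proposed in the comments below, which plays the same trick one level up and has $e$ as its identity:

    import math

    a, b = 2.0, 3.0

    # log turns multiplication into addition, and exp undoes log:
    assert abs(math.log(a * b) - (math.log(a) + math.log(b))) < 1e-9
    assert abs(math.exp(math.log(a)) - a) < 1e-9

    # Henry's operation a # b = a^log(b): commutative, identity e,
    # and log turns # into multiplication.
    def sharp(x, y):
        return x ** math.log(y)

    assert abs(sharp(a, b) - sharp(b, a)) < 1e-9
    assert abs(sharp(a, math.e) - a) < 1e-9
    assert abs(math.log(sharp(a, b)) - math.log(a) * math.log(b)) < 1e-9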






answered Jan 4 at 19:19 – lhf
  • Excellent point, but this is more a corollary than a justification. And the definition of such functions for the pair of identity elements $(0, 1)$ makes me wonder what functions could be defined with other pairs or additional operators defined with other identity elements like $-1$, $e$, or $pi$.
    – ismael
    Jan 4 at 19:28








  • The extension could then be that $a \,\#\, b = a^{\log(b)} = b^{\log(a)}$ on $\mathbb{R}_{>1}$, so $a \,@\, b = a^{1/\log(b)}$ with an identity which is $e$ for natural logarithms, and then $a \,\#\, \exp(1/\log(a)) = e$ showing the inverse (see the numeric sketch after these comments). – Henry, Jan 4 at 23:08










  • @Henry This is it! Beautiful! Thank you so very much. – ismael, Jan 5 at 2:10










  • @Henry Could you make this an answer to the question so that it can be closed? – ismael, Jan 5 at 2:10










  • @Henry With this result, we now have a very “natural” series 0, 1, e. This begs the following question: what comes next? I mean, this is too beautiful to be a simple coincidence... – ismael, Jan 5 at 2:29
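
Following up on Henry's comment above, here is a minimal sketch of the two claims made there: that $\exp(1/\log(a))$ is the $\#$-inverse of $a$, and that $@$ undoes $\#$ (again, `sharp` and `at` are just illustrative names for $\#$ and $@$):

```python
import math

def sharp(a, b):
    return a ** math.log(b)          # a # b

def at(a, b):
    return a ** (1.0 / math.log(b))  # a @ b

a, b = 4.2, 9.0  # both greater than 1, matching the domain R_{>1} in the comment
# The #-inverse of a is exp(1/log(a)): a # exp(1/log(a)) equals the identity e.
assert math.isclose(sharp(a, math.exp(1.0 / math.log(a))), math.e)
# @ cancels #: (a # b) @ b recovers a.
assert math.isclose(at(sharp(a, b), b), a)
```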
















How do you define these operations? If it's the primary-school way on real numbers, it follows from the facts that (i) reals form an Abelian group under $+$, its identity element famously named $0$, and (ii) reals $\ne 0$ form an Abelian group under $\times$. (Note this guarantees many similarities follow from group theory.) This, together with $a \times (b+c) = a \times b + a \times c$ (we say $\times$ distributes over $+$), defines a field. Maths has a lot of groups, and a lot of fields; and where you have fields, you have two very similar operations.






– J.G., answered Jan 4 at 19:27
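
As a concrete illustration of "where you have fields, you have two very similar operations", here is a minimal sketch using the finite field $\mathbb{Z}/5\mathbb{Z}$ (my choice of example, not one taken from the answer):

```python
from itertools import product

p = 5  # Z/5Z: the integers mod 5 form a field because 5 is prime

def add(a, b):
    return (a + b) % p

def mul(a, b):
    return (a * b) % p

elements = range(p)
nonzero = range(1, p)

# Additive group: 0 is the identity and every element has an additive inverse.
assert all(add(a, 0) == a for a in elements)
assert all(any(add(a, b) == 0 for b in elements) for a in elements)

# Multiplicative group on the nonzero elements: 1 is the identity, inverses exist.
assert all(mul(a, 1) == a for a in nonzero)
assert all(any(mul(a, b) == 1 for b in nonzero) for a in nonzero)

# Distributivity ties the two group structures together into a single field.
assert all(mul(a, add(b, c)) == add(mul(a, b), mul(a, c))
           for a, b, c in product(elements, repeat=3))
```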





















  • I define them on top of a coinductive type as outlined in this notebook. – ismael, Jan 4 at 19:34
















I think what's going on is this:



Suppose you have any binary operation $\star$ making $X$ an abelian group. One way to express it is as a relation: the subset $S \subseteq (X \times X) \times X$ where $a \star b = c$ iff $((a,b),c) \in S$.



You can immediately form a new relation $S' = \{((c,a),b) \mid ((a,b),c) \in S\}$, and that describes a different binary operation. The fact that $S$ was formed from an abelian group operation allows you to say that this new relation actually is a function.



And you can repeat this again to get $S'' = \{((b,c),a) \mid ((a,b),c) \in S\}$. It isn't as obvious from its construction out of $S'$ that this is a function, but we can appeal to $S$ again to prove that it is.



Repeating the trick a third time gets you back to $S$.



If you take the special case where $\star$ is addition, you'll find that $S'$ is subtraction where the thing subtracted is on the right, and $S''$ is like subtraction where the thing subtracted is on the left.



All this means, I think, is that the binary operations for some groups that we are all very familiar with can be translated to this new funky ordering, and because of the group properties contained in $S$, you will have a standard set of properties available in $S'$ (and also perhaps a slightly different set for $S''$, I didn't check).



My gut feeling is that the set of group axioms on $S'$ is equivalent in some sense to the abelian group axioms encompassed in $S$, so that we haven't really learned anything new: we've just rewritten all the addition in terms of subtraction, and all the division in terms of multiplication. It does not feel like there is anything significant in this process.






– rschwieb, answered Jan 4 at 19:52
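
Here is a minimal sketch of the construction described in this answer, taking $\mathbb{Z}/5\mathbb{Z}$ under addition as the abelian group (the example and the helper name `is_single_valued` are mine):

```python
from itertools import product

n = 5
X = range(n)

# Encode the group operation as a relation: ((a, b), c) is in S iff a + b = c (mod n).
S = {((a, b), (a + b) % n) for a, b in product(X, repeat=2)}

# The two permuted relations from the answer.
S1 = {((c, a), b) for (a, b), c in S}  # S'
S2 = {((b, c), a) for (a, b), c in S}  # S''

def is_single_valued(rel):
    # A relation is (the graph of) a function when no input maps to two outputs.
    return len({pair for pair, _ in rel}) == len(rel)

assert is_single_valued(S1) and is_single_valued(S2)

# S' is subtraction with the subtrahend on the right: ((c, a), b) says c - a = b.
assert all(b == (c - a) % n for (c, a), b in S1)
# S'' is subtraction with the subtrahend on the left: ((b, c), a) says c - b = a.
assert all(a == (c - b) % n for (b, c), a in S2)
```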





















  • Well, I am not so sure. First, I am not using set theory, I am using coinductive type theory, therefore some axioms of set theory are not necessary. Second, by defining subtraction before addition, I can deal with measures like temperatures that do not support addition (this is a very big deal for physicists and statisticians). Third, my real question is related to the values picked for the identity terms: what happens when these values are not $(0, 1)$? Or what happens when you add a third pair of operators with a third identity term (say $-1$)? Has anyone worked on this yet? – ismael, Jan 4 at 19:58








  • I don't understand your first and third points. I think I understand your second point, and I'm suggesting that even though that is fine, it probably amounts to the same thing as addition in the end. – rschwieb, Jan 4 at 20:00












  • I don’t think it does. While you can add a temperature delta to a temperature, you cannot add two temperatures. This suggests that the addition operator should not have the Cartesian product of the same set as its domain, but rather the Cartesian product of a type that does not support addition with a type that does support addition (I’m not sure that talking about Cartesian products for types makes perfect sense, but this is a totally different subject). This kind of hybrid domain is not allowed by groups or fields, unfortunately... – ismael, Jan 4 at 20:02












  • @ismael I'm not sure I agree with your specific example, that you can't add temperatures, but it sounds like you're talking about the idea of an affine space. Also, usually the Cartesian product of types is the type of pairs. E.g., in Haskell notation for algebraic data types, the Cartesian product of `a` and `b` is `Pair a b`, where `data Pair a b = Pair a b`. (Although in Haskell, you would usually just use the built-in type `(a,b)`.) – jgon, Jan 4 at 20:57












  • @jgon You’re totally on point: I am trying to define my types in a more “affine” manner, hence the names I gave to some of the properties of their operators. Also, it seems to me that by using such “affine” expressions for the axiomatic definition of the types, you end up with fewer axioms than if you don’t, which I view as a plus. For example, multiplication is defined with a single axiom, from which we can prove the identity term, associativity, and commutativity. These “affine” expressions are really quite powerful in that respect. – ismael, Jan 4 at 21:18
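
To illustrate that last remark, here is a sketch (my own reconstruction, using the axiom names from the question) of how commutativity of $\times$ follows from the single Multiplication Affine Identity together with the division axioms:

$$\frac{\alpha \times \beta}{\beta \times \alpha} = \frac{\alpha}{\;\frac{\beta \times \alpha}{\beta}\;} = \frac{\alpha}{\;\frac{\beta}{\beta/\alpha}\;} = \frac{\alpha}{\;\frac{\alpha}{\beta/\beta}\;} = \frac{\alpha}{\;\frac{\alpha}{1}\;} = \frac{\alpha}{\alpha} = 1,$$

using, in order, the Multiplication Affine Identity (twice), the Division Affine Identity, Self Division, and the Division Identity; reading Self Division as an equivalence then gives $\alpha \times \beta = \beta \times \alpha$.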

















