Why do lookup tables speed things up compared to brute force?
I'm currently reading up on lookup tables and efficiency. My university lecture notes say the following:
For Brute Force:
Preparation time: $O(1)$
Disk space requirement: $O(1)$
Time required to crack the password: $O(2^n)$
Full lookup table:
Preparation time: $O(2^n)$
Disk space requirement: $O(2^n)$
Time required to crack the password: $O(1)$
I'm not sure I'm understanding this correctly.
As far as I understand, for Brute Force:
$O(1)$ is the time required to look up the password in my table of all possible passwords.
The disk space requirement is simply as large as needed for a list of all possible passwords, so $O(1)$ as well.
My questions:
Why is the time needed to crack the password $O(2^n)$? How is it determined?
It seems I don't understand the concept of a lookup table either. As far as I can tell, a lookup table simply hashes the full list of possible passwords (so the required disk space is larger), but what then? Why is the actual cracking of the password faster?
I think I'm missing something crucial here. Any help will be appreciated.
Tags: brute-force-attack, complexity
asked 2 days ago by Fang (new contributor)
n-bit input, so you need to search through $2^n$ possible passwords.
– kelalaka, 2 days ago
@kelalaka: Thanks! I can follow that, at least.
– Fang, 2 days ago
2 Answers
Suppose you have an $n$-bit key. Suppose further you have some reliable predicate $P(k,m)$ which decides whether a key $k$ is the key you are looking for, given the reference $m$. Furthermore, suppose you have a "successor" function $F$ that takes a key and returns the "next" key, so that eventually all keys are traversed.
Now what a brute-force attack does is, given $m$, take a starting key $k$, evaluate $P(k,m)$, and if it returns false, set $k \gets F(k)$ and repeat. In the worst case this takes $\mathcal{O}(2^n)$ evaluations of $P$ and $F$. Assuming both can be computed in $\mathcal{O}(1)$, the entire calculation takes $\mathcal{O}(2^n)$.
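A minimal Python sketch of that loop, assuming $P$ means "does this key hash to the target value" (SHA-256 stands in for the real check) and $F$ is a plain increment; the names are illustrative, not part of the answer:

```python
import hashlib

def brute_force(target_hash: bytes, n_bits: int):
    """Try every n-bit key in order; worst case 2**n_bits evaluations of P and F."""
    for k in range(2 ** n_bits):                          # successor F: k -> k + 1
        key = k.to_bytes((n_bits + 7) // 8, "big")
        if hashlib.sha256(key).digest() == target_hash:   # predicate P(k, m)
            return key
    return None                                           # key space exhausted
```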
Now for the look-up table. For this you simply start at some key $k$, and store the $m$ such that $P(k,m)$ yields true as a pair $(m,k)$ in a giant array (or hashmap). Then you repeat this with $F(k)$. When you later get a concrete $m$, you simply look it up in the table and find the corresponding $k$. For hashmaps and contiguous arrays, this takes time $\mathcal{O}(1)$ (i.e. the lookup time is independent of the number of elements). Clearly, you need storage for all the $2^n$ pairs $(m,k)$. The preparation time assumes that you can find $m$ such that $P(k,m)$ is true, but this is usually possible in constant time, e.g. by evaluating a hash function or an encryption cipher.
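A matching sketch of the table approach: the $\mathcal{O}(2^n)$ preparation fills a Python dict mapping each $m$ back to its $k$, after which cracking is a single dict access (again SHA-256 and all names are placeholder assumptions):

```python
import hashlib

def build_table(n_bits: int) -> dict:
    """Preparation: hash every n-bit key once -- O(2^n) time and storage."""
    table = {}
    for k in range(2 ** n_bits):
        key = k.to_bytes((n_bits + 7) // 8, "big")
        table[hashlib.sha256(key).digest()] = key   # store the (m, k) pair
    return table

# Cracking is now one O(1) dictionary lookup, independent of the table size:
# table = build_table(24)
# key = table.get(target_hash)
```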
– SEJPM, answered 2 days ago
For hash tables, see: hash table runtime complexity (insert, search and delete).
– kelalaka, 2 days ago
Brute Force
Let's say your password is all digits (e.g. 607552).
If your password has length 1, it will take up to 10 attempts to crack:
0, 1, 2, 3, ..., 9
If your password has length 2, it will take up to 100 attempts to crack:
00, 01, 02, ... 10, 11, 12, ... 20, 21, 22, ... 99
If your password has length 3, it will take up to 1000 attempts to crack.
See the pattern? For a password of length $n$, the number of attempts is up to $10^n$. It's the same idea if you are cracking a password of $n$ bits: since a bit can only be 0 or 1 instead of 0-9, the count is $2^n$ instead of $10^n$.
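For concreteness, a short Python sketch of this enumeration over numeric passwords; `check` is a stand-in for whatever test confirms a guess (e.g. comparing its hash against a stolen hash):

```python
from itertools import product

def crack_numeric(check, length: int):
    """Try all 10**length digit strings; worst case 10^length attempts."""
    for digits in product("0123456789", repeat=length):
        guess = "".join(digits)
        if check(guess):        # placeholder predicate supplied by the caller
            return guess
    return None
```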
Lookup Table
While a brute-force attack tries all possible passwords to see if they produce the right hash, a lookup table is a fast way to do the opposite: recover a password from a hash. It does this by hashing all the possible passwords ahead of time and storing the results in a table.
Method 1:
Store the items in an array, sorted by hash. When given a hash, guess the position in the array of this hash, based on the numerical value of the hash. (I.e. if the numerical value is 25% of the maximum value the hash could be, then you would guess the index that is 25% of the total array size.) Your guess will be very close, because any good hash is evenly distributed. Go forwards or backwards until you find a match. This is freakishly close to $O(1)$, especially if you refine your guesses (binary-search-style) rather than just incrementing or decrementing.
The main disadvantage of this method is that initially sorting the table is asymptotically slower than generating it, namely $O(2^n \times \log(2^n)) = O(n \cdot 2^n)$.
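A rough Python sketch of Method 1, assuming the table is a list of `(hash_value, password)` pairs sorted by numeric hash value; the interpolation guess plus linear walk shown here is one possible reading of the answer, and the names are illustrative:

```python
def lookup_sorted(table, target, max_hash):
    """table: list of (hash_value, password) sorted by hash_value (ints); target: hash to find."""
    # Interpolate: a hash at ~25% of the maximum value should sit near ~25% of the array.
    i = min(int(len(table) * target / max_hash), len(table) - 1)
    step = 1 if table[i][0] < target else -1         # walk towards the target
    while 0 <= i < len(table):
        if table[i][0] == target:
            return table[i][1]
        if (table[i][0] < target) != (step == 1):    # walked past it: not present
            return None
        i += step
    return None
```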
Method 2:
This is essentially a hashmap, except using a library hashmap is silly when you already have a hash! Just use the first $n$ bits (binary digits) of your hash. That means: create an array of size $2^n$. In each slot goes a list of all hashes whose first $n$ bits equal that array index (together with their corresponding passwords).
To look up a hash, take its first $n$ bits and fetch the corresponding list out of the array. Traverse the list to find the hash. Like a hashmap, this takes amortised $O(1)$ (constant) time.
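A sketch of Method 2 in Python, assuming we index by the first $n$ bits of each hash; the bucket array plays the role of the hashmap, and the function names are made up for illustration:

```python
def build_buckets(pairs, n_bits):
    """pairs: iterable of (hash_bytes, password). Bucket by the first n_bits of each hash."""
    buckets = [[] for _ in range(2 ** n_bits)]
    for h, pw in pairs:
        idx = int.from_bytes(h, "big") >> (8 * len(h) - n_bits)   # first n bits as index
        buckets[idx].append((h, pw))
    return buckets

def lookup(buckets, h, n_bits):
    """Amortised O(1): jump straight to the right bucket, then scan its short list."""
    idx = int.from_bytes(h, "big") >> (8 * len(h) - n_bits)
    for stored_hash, pw in buckets[idx]:
        if stored_hash == h:
            return pw
    return None
```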
– Artelius, answered yesterday (new contributor)