Build and run with `make 1>/dev/null && ./calc in.txt`. Only the first line of in.txt is calculated.
Ryan Stewart and Christopher Mahoney
For assignment two we wrote two parts: a tokenizer that uses the shunting-yard algorithm to parse input, and a bigint library that handles the arithmetic.
The gist of how bigints are implemented is that we use a doubly linked list of uint32_t's, limiting each node to 10^8 - 1 (99999999). This way, when performing arithmetic operations we can cast each node to a uint64_t; since (2^32 - 1) * (2^32 - 1) < 2^64 - 1, we'll never have overflow. We chose the limit of 10^8 - 1 rather than 2^32 - 1 so that we can print bigints efficiently, each node holding exactly eight decimal digits.
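A sketch of that representation might look like the following (these names are illustrative, not the actual declarations from bigint.h):

```c
#include <stdint.h>

/* One node of the doubly linked list: a base-10^8 "limb". */
typedef struct limb {
    uint32_t value;      /* 0 .. 99999999: exactly eight decimal digits */
    struct limb *prev;   /* toward the more significant end */
    struct limb *next;   /* toward the less significant end */
} limb;

/* A bigint is the list plus a sign flag. */
typedef struct bigint {
    limb *head;          /* most significant limb */
    limb *tail;          /* least significant limb */
    int negative;        /* nonzero if the value is < 0 */
} bigint;
```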
From here we implemented the arithmetic operations the way you would do them in grade school. However, instead of memorizing the results of operations on single digits, we use the ALU to efficiently give us the results of 8-digit-by-8-digit operations. That is to say, if n is the length of the inputs, addition and subtraction are O(n) and multiplication is O(n^2).
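As a sketch of the grade-school carry loop described above (using arrays as stand-ins for the linked-list nodes; this is not the library's actual API), base-10^8 addition looks like:

```c
#include <stdint.h>

#define LIMB_BASE 100000000u   /* each limb holds eight decimal digits, i.e. <= 10^8 - 1 */

/* Grade-school addition on little-endian limb arrays (least significant
 * limb first). Returns the number of limbs written to out, which must
 * have room for max(na, nb) + 1 limbs. */
int limb_add(const uint32_t *a, int na, const uint32_t *b, int nb, uint32_t *out) {
    uint64_t carry = 0;
    int n = na > nb ? na : nb;
    for (int i = 0; i < n; i++) {
        uint64_t s = carry;            /* widen to 64 bits: the sum cannot overflow */
        if (i < na) s += a[i];
        if (i < nb) s += b[i];
        out[i] = (uint32_t)(s % LIMB_BASE);
        carry  = s / LIMB_BASE;        /* at most 1 */
    }
    if (carry) out[n++] = (uint32_t)carry;
    return n;
}
```

For example, adding {99999999, 1} (the number 199999999) and {1} carries across the limb boundary and yields {0, 2} (the number 200000000).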
The bigint library natively supports negative numbers, and the tokenizer can parse most expressions containing negatives. There are some strange situations (especially with parentheses) where parsing negatives can fail. For example, (-3+5) * -3 will work, but (-3+5) * (-3) would break.
We also implemented exponentiation in O(log n) multiplications (n being the value of the exponent), using a combination of addition-chain exponentiation and a modified exponentiation by squaring. On average, each decimal digit of the exponent is processed in 7 multiplications; the best case is 4 when the digit is 0, and the worst is 9 when the digit is 7 or 9.
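The digit-by-digit scheme can be sketched as follows, with uint64_t standing in for bigints (names are hypothetical, not the actual bigint_pow.c code). For each decimal digit d of the exponent, left to right, we set r = r^10 * base^d; r^10 costs 4 multiplications via ((r^2)^2 * r)^2, which explains the best case of 4 when d is 0. For brevity the helper below uses binary exponentiation for the small powers rather than the optimal addition chains the write-up describes:

```c
#include <stdint.h>

/* b^d for small d, by binary exponentiation (an addition chain would
 * save a multiply for some digits, e.g. d = 10). */
static uint64_t pow_small(uint64_t b, int d) {
    uint64_t r = 1;
    while (d) {
        if (d & 1) r *= b;
        b *= b;
        d >>= 1;
    }
    return r;
}

/* Process the exponent one decimal digit at a time, most significant
 * first: r = r^10 * base^digit. */
uint64_t pow_by_decimal_digits(uint64_t base, unsigned exp) {
    int digits[20], n = 0;
    uint64_t r = 1;
    if (exp == 0) return 1;
    while (exp) { digits[n++] = (int)(exp % 10); exp /= 10; }
    for (int i = n - 1; i >= 0; i--) {
        r = pow_small(r, 10);                 /* r = r^10 */
        if (digits[i])
            r *= pow_small(base, digits[i]);  /* skip entirely when the digit is 0 */
    }
    return r;
}
```

For example, 3^13 is computed as ((1^10 * 3^1)^10) * 3^3 = 59049 * 27 = 1594323.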
We decided to write our own tokenizer so we could get more familiar with that aspect of CS. If you read the tokenizer you'll see it is bounded in the length of the strings it can load. In practice you would need zettabytes of memory to ever load such a string, and we could always bump up to a 128-bit length, at which point you'd need 565000 moles of RAM. It could also have been written to work around this, but under the time constraint that wasn't practical. Either way, we also included a modified version of the original finite calculator tokenizer, which you can compile with `gcc backup.c bigint.c bigint_compare.c -o backup`. Note that backup.c does not support exponentiation or reading negatives from stdin.