
Linked List Operations

Adding

An item can be added to a linked list at the beginning, at the end, or at some arbitrary point in between. When adding a new item to, say, a LinkedList&lt;Car&gt;, we must first create a new Node, place the Car in the Node's data field, and then link the Node into the list correctly. A sketch of this is shown below.
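
As a sketch, here is how adding might look on a hand-rolled singly linked list in Java. The CarList, Car, and Node names are illustrative for this lesson, not the built-in java.util.LinkedList:

```java
// A minimal singly linked list of Car objects (illustrative names).
public class CarList {
    static class Car {
        String plate;
        Car(String plate) { this.plate = plate; }
    }

    static class Node {
        Car data;   // the Node's data
        Node next;  // link to the next Node, or null at the end
        Node(Car data) { this.data = data; }
    }

    Node head;

    // Add at the beginning: the new Node points at the old head.
    void addFirst(Car car) {
        Node node = new Node(car);  // 1. create the Node, place the Car in it
        node.next = head;           // 2. link it in front of the old head
        head = node;                // 3. the new Node becomes the head
    }

    // Add at the end: walk to the last Node, then link the new one after it.
    void addLast(Car car) {
        Node node = new Node(car);
        if (head == null) {         // empty list: the new Node is the head
            head = node;
            return;
        }
        Node current = head;
        while (current.next != null) {
            current = current.next; // walk until the last Node
        }
        current.next = node;        // link the new Node at the end
    }

    public static void main(String[] args) {
        CarList list = new CarList();
        list.addFirst(new Car("ABC-123"));
        list.addLast(new Car("XYZ-789"));
        System.out.println(list.head.data.plate);      // ABC-123
        System.out.println(list.head.next.data.plate); // XYZ-789
    }
}
```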

Removing

Removing can be simple. If there is only one Node in the linked list, just set the head to null.

Removing can be complicated. If the data we want to remove sits in a Node in the middle of the linked list, we have to make sure we don't unlink the wrong Node. If Node 4 is being deleted, we need to link Node 3 to Node 5. What if there is no Node 5? Then Node 3 can just point to null, as in the sketch below.
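
A minimal removal sketch, again with a hypothetical Node class (holding ints for brevity). Notice how the one-Node case and the "no Node 5" case both fall out of the same re-linking:

```java
// Removing a value from a singly linked list (sketch).
public class RemoveDemo {
    static class Node {
        int data;
        Node next;
        Node(int data) { this.data = data; }
    }

    static Node head;

    // Remove the first Node containing the given value.
    static void remove(int value) {
        if (head == null) return;      // nothing to remove
        if (head.data == value) {
            head = head.next;          // if it was the only Node, head becomes null
            return;
        }
        Node current = head;
        while (current.next != null) {
            if (current.next.data == value) {
                // Link "Node 3" straight to "Node 5". If there is no
                // Node 5, current.next.next is null, so "Node 3" simply
                // ends up pointing to null.
                current.next = current.next.next;
                return;
            }
            current = current.next;
        }
    }

    public static void main(String[] args) {
        head = new Node(1);                      // build 1 -> 2 -> 3
        head.next = new Node(2);
        head.next.next = new Node(3);
        remove(2);                               // now 1 -> 3
        System.out.println(head.data + " -> " + head.next.data);
    }
}
```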

Modifying and Searching

LinkedLists, like ArrayLists, are quite slow at finding elements by value. We have to look at each Node in turn, check whether it contains the data we're looking for, and only then can we modify its contents. If the element happens to sit in the last Node, we end up looking at every single Node, which means this data structure has a worst-case scenario of SLOW.
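
A linear search sketch with the same kind of hypothetical Node class. The cost is all in the walk; once the Node is found, modifying its contents is cheap:

```java
// Linear search in a singly linked list (sketch). There is no shortcut
// to the middle: we must walk Node by Node from the head.
public class SearchDemo {
    static class Node {
        String data;
        Node next;
        Node(String data) { this.data = data; }
    }

    // Returns the first Node containing target, or null if absent.
    static Node find(Node head, String target) {
        Node current = head;
        while (current != null) {
            if (current.data.equals(target)) {
                return current;          // found it
            }
            current = current.next;      // keep walking
        }
        return null;                     // looked at every Node and failed
    }

    public static void main(String[] args) {
        Node head = new Node("red");
        head.next = new Node("green");
        head.next.next = new Node("blue");

        Node hit = find(head, "blue");   // worst case: visits all three Nodes
        if (hit != null) {
            hit.data = "navy";           // modifying is cheap once found
        }
        System.out.println(head.next.next.data); // navy
    }
}
```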

Measuring Slowness

“Premature optimization is the root of all evil.”

Donald Knuth

Many programmers get stuck creating the absolute most efficient algorithms, even though they have not measured whether those optimizations will make a difference. Interestingly enough, these “optimizations” often lead to even slower code. The enhancements are not noticeable because the scale at which we are working is very small; optimizations only show when large amounts of data are being processed. (There are certain rules that, when broken, will create slow code even with small amounts of data.)

To measure algorithms we use something called asymptotic notation. There are many ways to measure algorithms asymptotically, but Big O notation is the most common in data structures classes. It describes how the number of operations grows with the size of the input, usually for the worst-case scenario of a data structure. For example, because searching a linked list requires, in the worst case, looking at every element, a list of N elements costs N operations. Linked lists therefore have a search time complexity of O(n).
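
In the spirit of measuring rather than guessing, here is a sketch that counts comparisons during a worst-case search of the real java.util.LinkedList. For N elements it reports exactly N comparisons, which is O(n) made concrete:

```java
// Counting operations instead of guessing (sketch). For lists of
// increasing size N, a worst-case search (target at the very end)
// performs exactly N comparisons.
import java.util.LinkedList;

public class CountDemo {
    static int comparisons;

    static boolean search(LinkedList<Integer> list, int target) {
        for (int value : list) {
            comparisons++;               // one comparison per Node visited
            if (value == target) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        for (int n : new int[] {10, 100, 1000}) {
            LinkedList<Integer> list = new LinkedList<>();
            for (int i = 0; i < n; i++) list.add(i);
            comparisons = 0;
            search(list, n - 1);         // worst case: target is the last element
            System.out.println("N = " + n + ", comparisons = " + comparisons);
        }
    }
}
```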

Some data structures have better time complexities for search and others have worse. HashMaps have an average search time complexity of O(1), meaning that no matter how much data is stored, a lookup takes a constant number of operations. Really bad algorithms can have a time complexity of O(n^2) or even O(n^n), meaning that as the data set grows, the running time gets worse much faster than the data itself!
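
A rough illustration of the difference, using the real java.util.LinkedList and java.util.HashMap. The timing here is a crude micro-measurement, not a proper benchmark, but the gap grows with N:

```java
// LinkedList.contains walks the list (O(n)), while HashMap.containsKey
// hashes straight to a bucket (O(1) on average). Sketch only; the
// nanosecond readings are indicative, not a rigorous benchmark.
import java.util.HashMap;
import java.util.LinkedList;

public class LookupDemo {
    public static void main(String[] args) {
        int n = 100_000;
        LinkedList<Integer> list = new LinkedList<>();
        HashMap<Integer, Boolean> map = new HashMap<>();
        for (int i = 0; i < n; i++) {
            list.add(i);
            map.put(i, true);
        }

        long start = System.nanoTime();
        list.contains(n - 1);                      // worst case: last element
        long listTime = System.nanoTime() - start;

        start = System.nanoTime();
        map.containsKey(n - 1);                    // one hash, one bucket
        long mapTime = System.nanoTime() - start;

        System.out.println("LinkedList search: " + listTime + " ns");
        System.out.println("HashMap lookup:    " + mapTime + " ns");
    }
}
```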

We’ll look at the time complexities of other data structures as we encounter them, and you’ll be required to know the time complexity of each, and why.