In the last session, we went over some basic stack and queue concepts. Stacks and queues are abstract data types, meaning that they declare an interface of operations to manipulate the data but not the underlying implementation.

A stack is last-in-first-out (LIFO): the last inserted (pushed) element is the first to be extracted (popped). A stack implementation should define two operations: stack.push(element_type x) and element_type stack.pop(). Think of a stack of dishes in a buffet: you can push new plates onto the stack, but when you grab a plate from the stack, it will be the last plate that was pushed onto it. Some variants of stacks define three functions: push(element_type x), void pop(), and element_type peek(). Here the role of the original pop() is split in two: peek() returns the top value without removing it, and pop() just removes the top element without returning it.

A queue is a first-in-first-out (FIFO) data structure with the operations queue.enqueue(element_type x) and element_type queue.dequeue().

The running times of these operations are wholly dependent on how the stack or queue is implemented. Typically, we want O(1) (constant-time) push, pop, enqueue, and dequeue, but that isn't always achievable if we are constrained by space or allocation requirements.
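To make the interfaces above concrete, here is one minimal sketch in Python (the class names and the choice of a list and a deque as backing stores are my own; the interfaces don't prescribe any particular implementation):

```python
from collections import deque


class Stack:
    """LIFO stack backed by a Python list; all operations are O(1) amortized."""

    def __init__(self):
        self._items = []

    def push(self, x):
        self._items.append(x)  # insert at the top (end of the list)

    def pop(self):
        return self._items.pop()  # remove and return the most recently pushed element

    def peek(self):
        return self._items[-1]  # return the top element without removing it


class Queue:
    """FIFO queue backed by collections.deque; enqueue and dequeue are O(1)."""

    def __init__(self):
        self._items = deque()

    def enqueue(self, x):
        self._items.append(x)  # add at the back

    def dequeue(self):
        return self._items.popleft()  # remove and return the oldest element
```

Note that the queue uses a deque rather than a plain list: popping from the front of a Python list is O(n) because every remaining element shifts down, while deque.popleft() is O(1).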
This is part 1 of a two-part series. Skip over to part 2 if you'd like.

For coding interviews, we are interested in gauging the asymptotic efficiency of algorithms, both in terms of running time and space. The formal study of the analysis of algorithms is called complexity theory, a rich field with fascinating and complicated math. For interviews, we only need a few basic concepts. Asymptotic efficiency is concerned with how the running time or memory requirements of an algorithm grow with the input size, so it is intimately concerned with how well algorithms scale to larger inputs. This matters in computer science and in practice because some algorithms that work well enough for small inputs (say, fewer than 10 elements) have running time and space that grow far faster than the input size, so large inputs (tens to millions of elements) become impractical, usually meaning hours or even years of execution time. Consider sorting. Say for the sake of argument that sorting
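To make "how running time grows with input size" concrete, here is a small, hypothetical step-counting sketch (the function names are mine, not from the post). A single pass over the input does n units of work, while a naive pairwise comparison of all elements does n² units; doubling n doubles the first but quadruples the second:

```python
def linear_steps(n):
    """Count the iterations of a single pass over n items: O(n)."""
    steps = 0
    for _ in range(n):
        steps += 1
    return steps


def quadratic_steps(n):
    """Count the iterations of a nested pass over n items, as in a
    naive all-pairs comparison: O(n^2)."""
    steps = 0
    for _ in range(n):
        for _ in range(n):
            steps += 1
    return steps


# The gap widens dramatically as n grows: at n = 1000, the quadratic
# version does 1000x the work of the linear one.
for n in (10, 100, 1000):
    print(n, linear_steps(n), quadratic_steps(n))
```

This is why an algorithm that feels instant at n = 10 can become unusable at n in the millions: the constant factors matter less and less as the growth rate takes over.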