Showing posts from February, 2026

CST 370: Week 7

This week we learned two new sorting techniques, counting sort and radix sort, both of which sort a set of input values without comparing them to one another. Counting sort finds the frequency of each value in the set, then determines where each value belongs by accumulating the frequencies from the lowest end of the range to the highest. Instead of evaluating each value as a whole, radix sort looks at the individual digits of each value and distributes them into buckets based on a range of 0 to 9, repeating the process from each value's least significant digit (LSD) up to its first (or leading) digit. We also learned about dynamic programming, which lets us save computations when solving a problem that involves recalculating its subproblems multiple times: we store the solutions to those subproblems in an array that can be referred to later rather than recalculated. We learned to apply this tec...
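As a rough sketch of these three ideas in Python (the function names, the assumption of non-negative integers, and the use of memoization for the dynamic-programming example are my own choices, not something prescribed by the course):

```python
from functools import lru_cache

def counting_sort(values, lo, hi):
    """Sort integers in [lo, hi] without comparisons, via frequency counts."""
    counts = [0] * (hi - lo + 1)
    for v in values:
        counts[v - lo] += 1
    # Accumulate frequencies so counts[i] marks where values <= lo+i end up.
    for i in range(1, len(counts)):
        counts[i] += counts[i - 1]
    out = [0] * len(values)
    for v in reversed(values):          # reversed traversal keeps the sort stable
        counts[v - lo] -= 1
        out[counts[v - lo]] = v
    return out

def radix_sort(values):
    """LSD radix sort: distribute by each digit (0-9), least significant first."""
    if not values:
        return []
    exp = 1
    while max(values) // exp > 0:
        buckets = [[] for _ in range(10)]
        for v in values:
            buckets[(v // exp) % 10].append(v)  # distribute on the current digit
        values = [v for bucket in buckets for v in bucket]
        exp *= 10
    return values

@lru_cache(maxsize=None)
def fib(n):
    """Dynamic programming via memoization: cache subproblem answers
    instead of recomputing them on every recursive call."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```

For example, `counting_sort([4, 2, 7, 2], 0, 9)` and `radix_sort([170, 45, 75, 90, 802, 24, 2, 66])` both return their inputs in ascending order without ever comparing two whole values against each other.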

CST 370: Week 6

This week we learned about two more trees, AVL and 2-3, as well as heaps and hashing. AVL trees are balanced binary search trees in which the difference between the left and right subtrees is -1, 0, or 1. This difference is referred to as the balance factor (or balancing factor) and is specifically defined as the height of the left subtree minus the height of the right subtree. 2-3 trees, on the other hand, have subtrees of the same height, and each node can hold one or two values, whereas most other trees we've worked with hold only one value per node. Heaps are complete binary trees, meaning every level but the last must be completely filled and the nodes on the last level are placed as far to the left as possible. Heaps are also defined by how the parent nodes relate to the child nodes: each node in a max heap is greater than or equal to the values of its children, and each node in a min heap is less than or equal to the values of its children. We l...
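Two of these definitions are easy to check directly in code. Here is a small sketch in Python (the `Node` class and function names are my own; it assumes the convention that an empty tree has height -1, so a single leaf has height 0):

```python
class Node:
    """A minimal binary tree node for illustration."""
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(node):
    """Height = length of the longest root-to-leaf path; empty tree is -1."""
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    """AVL balance factor: height of left subtree minus height of right subtree."""
    return height(node.left) - height(node.right)

def is_max_heap(a):
    """Check the max-heap property on an array-backed complete binary tree,
    where the children of index i sit at 2i+1 and 2i+2."""
    n = len(a)
    for i in range(n):
        for child in (2 * i + 1, 2 * i + 2):
            if child < n and a[i] < a[child]:
                return False
    return True
```

In an AVL tree, `balance_factor` must stay in {-1, 0, 1} at every node; the array form used by `is_max_heap` works precisely because heaps are complete, so no index gaps appear.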

CST 370: Week 5

This week we learned about the quicksort algorithm, binary tree traversal, and decrease-and-conquer algorithms. The quicksort algorithm uses the divide-and-conquer approach like merge sort, but partitions based on a particular value referred to as the pivot. The pivot is typically the first element of a given array, but sometimes we apply the median-of-three rule instead, which involves comparing and ordering the first, middle, and last elements of the array and selecting the middle one as the pivot. As for binary tree traversal, we learned the three order variations used for moving through a tree (preorder, inorder, and postorder) and how to determine a tree's height (the length of the longest path from the root to a leaf node in the tree). In terms of decrease and conquer, the technique is based on the idea that smaller problems are solved more easily, and we learned there are three variations: decrease by a constant, decrease by a constant factor, and variab...
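A quicksort using the median-of-three rule might look like the following sketch in Python (the Lomuto-style partition and the function names are my own choices for illustration):

```python
def median_of_three(a, lo, hi):
    """Order a[lo], a[mid], a[hi] in place and return mid, whose element
    is now the median of the three and serves as the pivot."""
    mid = (lo + hi) // 2
    if a[mid] < a[lo]:
        a[lo], a[mid] = a[mid], a[lo]
    if a[hi] < a[lo]:
        a[lo], a[hi] = a[hi], a[lo]
    if a[hi] < a[mid]:
        a[mid], a[hi] = a[hi], a[mid]
    return mid

def quicksort(a, lo=0, hi=None):
    """In-place quicksort, choosing the pivot by the median-of-three rule."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    mid = median_of_three(a, lo, hi)
    a[lo], a[mid] = a[mid], a[lo]       # move pivot to the front, then partition
    pivot = a[lo]
    i = lo
    for j in range(lo + 1, hi + 1):     # everything < pivot goes left of i
        if a[j] < pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[lo], a[i] = a[i], a[lo]           # drop the pivot into its final spot
    quicksort(a, lo, i - 1)             # sort each side around the pivot
    quicksort(a, i + 1, hi)
    return a
```

After each partition the pivot is in its final sorted position, so the two recursive calls never need to look at it again.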

CST 370: Week 4

This week we only learned about merge sort, which uses a divide-and-conquer approach to sort the contents of an array. Given an unsorted array, merge sort divides the array into halves, referred to as subarrays. The subarrays are repeatedly divided into smaller subarrays until we are left with individual elements. Each of those elements is then compared against another and placed into a new array in sorted order. Those arrays are then compared and combined with others until we are left with an array of the same size and content as the original, though now sorted.
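The divide-then-merge process above can be sketched in Python roughly as follows (the function name and the top-down recursive style are my own choices):

```python
def merge_sort(a):
    """Divide-and-conquer: split into halves, sort each, then merge in order."""
    if len(a) <= 1:                     # a single element is already sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])          # recursively sort each subarray
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # compare the fronts of each half
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]      # one side may have leftovers
```

For instance, `merge_sort([38, 27, 43, 3])` first splits down to the single elements 38, 27, 43, and 3, merges them pairwise into [27, 38] and [3, 43], and finally merges those into [3, 27, 38, 43].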