CST 370: Week 2

This week we learned about using asymptotic notation to analyze both non-recursive and recursive algorithms. The three notations used in algorithm analysis are Big Oh (O(f(n))), Big Theta (Θ(f(n))), and Big Omega (Ω(f(n))). They describe an algorithm's efficiency by its order of growth: Big Omega (Ω) gives a lower bound (best case), Big Oh (O) gives an upper bound (worst case), and Big Theta (Θ) applies when the upper and lower bounds match, i.e., when all cases have the same growth. For non-recursive algorithms, we count how many times the basic operation executes, then drop the low-order terms and constant coefficients to identify the order of growth. For recursive algorithms, we first define the recurrence relation and its initial condition, then apply backward substitution to reach a closed-form count and, from it, the order of growth. In relation to algorithm analysis, we also learned about the brute-force approach, which solves a problem in the most straightforward way suggested by its definition.
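As a small sketch of the recursive analysis described above (the function and setup here are my own illustration, not from the course materials), consider summing the first n integers recursively and counting additions with backward substitution:

```python
def recursive_sum(n):
    """Return 1 + 2 + ... + n recursively (illustrative example)."""
    if n == 0:                        # initial condition: A(0) = 0 additions
        return 0
    return recursive_sum(n - 1) + n   # recurrence: A(n) = A(n-1) + 1

# Backward substitution on A(n) = A(n-1) + 1 with A(0) = 0:
#   A(n) = A(n-1) + 1
#        = A(n-2) + 2
#        = ...
#        = A(n-i) + i
#        = A(0) + n = n
# so the number of additions grows as Theta(n).

print(recursive_sum(5))
```

The comments walk through the same backward-substitution steps we practiced: expand the recurrence until the initial condition appears, then read off the closed-form count.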
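A minimal example of the brute-force approach, again my own sketch rather than a course assignment: finding the largest element by simply checking every item.

```python
def brute_force_max(a):
    """Find the maximum of a non-empty list by checking every element."""
    largest = a[0]
    for x in a[1:]:       # basic operation: one comparison per element
        if x > largest:
            largest = x
    return largest        # n - 1 comparisons total -> Theta(n)

print(brute_force_max([3, 1, 4, 1, 5, 9, 2, 6]))
```

Nothing clever happens here, which is the point: brute force solves the problem directly from its definition, and its efficiency follows immediately from counting the basic operation.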
