The "stories" in the writing seem good. If you want to learn by reading somebody explain how to solve a problem, I think this is a good text for you.
I don't think there are enough exercises... with a title of "problem solving..." I was expecting more focus on (challenging?) problems in the exercises. If you want to learn by thinking about how to solve problems, this is not the text for you.
I could not find a comparison of the orders of various operations on ADTs. I thought I would find something like that in Chapter 3. I think that omission makes it hard to understand the big picture.
Details seem raw:
* The table on page 9 shows the Pythonic not-equals operator as `=!` (it's `!=`).
* 2.7 Programming Exercise 4 should really specify linear _in which parameter(s)_ for clarity; the operation is presumably O(nk). Exercise 5 assumes that n = len(data) from Exercise 4, and the claim that O(n log n) < O(nk) depends on the relative sizes of k and n, which isn't clearly stated.
* 3.14 Programming Exercise 7: this exercise has only one particular solution in mind. Doesn't a double-ended queue have O(1) enqueue and dequeue?
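For reference: Python's standard-library `collections.deque` gives O(1) appends and pops at both ends, so a queue built on it avoids the O(n) dequeue of a list-based implementation. A minimal sketch (class name mine, not the book's):

```python
from collections import deque

class DequeQueue:
    """Queue backed by collections.deque: enqueue and dequeue are
    both O(1), unlike a list-based queue where pop(0) is O(n)."""

    def __init__(self):
        self._items = deque()

    def enqueue(self, item):
        self._items.append(item)      # O(1) append on the right

    def dequeue(self):
        return self._items.popleft()  # O(1) pop on the left

    def is_empty(self):
        return not self._items

q = DequeQueue()
for x in (1, 2, 3):
    q.enqueue(x)
print(q.dequeue())  # → 1
print(q.dequeue())  # → 2
```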
You might find the Debug Visualizer extension for VS Code helpful for visualizing algorithms and data structures [1].
It kind of works with Python, but works best with JavaScript/TypeScript.
I'm not impressed with the dynamic programming part, which is what I'm trying to learn right now.
The book simply lists it under recursion, and after skimming I see no mention of bottom-up (iterative) DP; there's only one problem about it.
I'm still looking for a good resource, but this one gave me quite a bit of insight [1]: it reuses the same problem over and over, tweaking it each time, which gave me a better view of DP.
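For anyone in the same boat: bottom-up DP just means filling in a table iteratively instead of recursing. A minimal sketch contrasting the two styles on Fibonacci (my own example, not from the book):

```python
from functools import lru_cache

# Top-down DP: plain recursion plus memoization.
@lru_cache(maxsize=None)
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up DP: build the answer iteratively from the base cases,
# no recursion at all, and only O(1) extra space here.
def fib_bottom_up(n):
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(n - 1):
        prev, curr = curr, prev + curr
    return curr

print(fib_memo(30))       # → 832040
print(fib_bottom_up(30))  # → 832040
```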
"Algorithm Design" by Kleinberg & Tardos is pretty good. Chapter 6 is its Dynamic Programming chapter, and there are 9 examples in the chapter with pseudocode and analysis.
Some of its dynamic programming examples: Weighted Interval Scheduling, Segmented Least Squares, Subset Sums / Knapsacks, RNA Secondary Structure, Sequence Alignment (aka: Diff), Shortest Path in Graphs.
All in just a ~50 page chapter.
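As a rough taste of that chapter's flavor: weighted interval scheduling comes down to the recurrence OPT(j) = max(OPT(j-1), w_j + OPT(p(j))), where p(j) is the last interval finishing before interval j starts. A hypothetical Python sketch of it (names mine, not the book's pseudocode):

```python
import bisect

def max_weight_schedule(intervals):
    """intervals: list of (start, finish, weight).
    Returns the max total weight of pairwise non-overlapping intervals."""
    intervals = sorted(intervals, key=lambda iv: iv[1])  # by finish time
    finishes = [iv[1] for iv in intervals]
    n = len(intervals)
    opt = [0] * (n + 1)  # opt[j] = best value using the first j intervals
    for j in range(1, n + 1):
        start, _, weight = intervals[j - 1]
        # p(j): how many of the first j-1 intervals finish by this start
        p = bisect.bisect_right(finishes, start, 0, j - 1)
        # Either skip interval j, or take it plus the best compatible prefix.
        opt[j] = max(opt[j - 1], weight + opt[p])
    return opt[n]

jobs = [(1, 4, 3), (3, 5, 2), (0, 6, 6), (4, 7, 2), (3, 8, 5), (5, 9, 4)]
print(max_weight_schedule(jobs))  # → 7, via (1, 4, 3) and (5, 9, 4)
```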
The other chapters, on Network Flow and Greedy Algorithms, are also excellent.
"Algorithm Design" is an advanced text, but that's how it covers more material than easier textbooks: it doesn't spend time on the beginner-level stuff.
-------------
Honestly, given the beginner-nature of the textbook here, I think they're doing a decent job. Dynamic programming isn't as fundamental as the stack / queue / sorting / searching etc. etc. that's being discussed in "Problem Solving with Algorithms and Data Structures".
Dynamic programming is definitely something that should be reserved for more advanced textbooks (with maybe, at best, an introduction to the subject at this level).
> Honestly, given the beginner-nature of the textbook here, I think they're doing a decent job. Dynamic programming isn't as fundamental as the stack / queue / sorting / searching etc. etc. that's being discussed in "Problem Solving with Algorithms and Data Structures".
Ah, I think I set my expectations wrong with regards to the level.
What I love the most about this is the explicit walkthrough of how to build the various data structures. Many DS courses make implementing the structure the homework assignment, but the reality is that these algorithms have been implemented to death online, and the real benefit comes from analyzing which is most appropriate. That restricts the possibilities for automatic grading, but students at this level should be progressing toward the "Evaluate" stage of Bloom's taxonomy anyway: they should be getting practice assessing the trade-offs between data structures, not trying to reinvent the wheel.
That said, many job interviews expect people to be able to implement data structures on the fly. However, I'd say that's more a limitation of the interview process rather than an issue with the textbook.
You can solve them with pen and paper. And yes, you are using algorithms then too but not in the same way as on a computer.
But anyway, I think you read the title the opposite way from how it was meant. I think it's meant to reflect that algorithms and data structures, which in general have many uses, are here being applied to problem solving, i.e. to let readers know that the book's focus is specific.
I always felt math uses data structures and algorithms. Properties of whatever you're manipulating give you some sort of known structure with proven theorems; then you apply these theorems one after another until you get to where you maybe "want" to be.
It seems to me that mathematical objects are often more abstract than data structures are required to be. For example, you can reason about a triangle without specifying how you would represent it. You could represent a triangle as a list of three points, or as a list of side lengths and angles or even a list of angles which forces you to reason about the class of similar triangles with those angles. In fact, to represent certain algebraic structures is an entire subfield of algebra (representation theory) distinct from the parts of algebra that reason about those structures without trying to represent them.
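To make that concrete, here's a toy sketch (all names hypothetical) of two of those representations. Both can answer a question like "what's the perimeter?", but the side-length version has already abstracted away the triangle's position:

```python
import math

class TriangleByPoints:
    """Representation 1: three points in the plane."""

    def __init__(self, a, b, c):
        self.a, self.b, self.c = a, b, c

    def perimeter(self):
        def dist(p, q):
            return math.hypot(p[0] - q[0], p[1] - q[1])
        return dist(self.a, self.b) + dist(self.b, self.c) + dist(self.c, self.a)

class TriangleBySides:
    """Representation 2: three side lengths; position and
    orientation are no longer part of the data at all."""

    def __init__(self, x, y, z):
        self.x, self.y, self.z = x, y, z

    def perimeter(self):
        return self.x + self.y + self.z

# The same 3-4-5 right triangle in both representations:
p = TriangleByPoints((0, 0), (3, 0), (0, 4))
s = TriangleBySides(3, 4, 5)
print(p.perimeter(), s.perimeter())  # → 12.0 12
```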
Representation theory is the study of functors between the category of vector spaces and a category of interest, which is quite different from the problem of choosing among different representations of a triangle.
Different how? They are both ways of making abstract descriptions of an object more concrete, which is analogous to the job of a programmer designing a data structure to concretely describe an abstractly specified computational object.
My point is just that in math, you can reason about an object even if you can't concretely represent it. For example, you can work out properties of a number or function that you don't know enough about to write down. But in programming you can't use an object unless you have a concrete representation of it, and often getting that concrete representation (in both math and programming) requires a kind of work that is quite different from the work of abstractly describing or specifying it. So it's maybe not always appropriate to call an abstract description of a mathematical object a "data structure", since often it's more analogous to, say, the interface of an object than its implementation.