My calculus students entered their fifth week of the semester having seen limits, asymptotes, average and instantaneous rates of change, and the power rule for differentiation. So far we could have been following either the Barnett, Ziegler, Byleen textbook (BZB, as currently assigned), or the Goldstein, Lay, Schneider textbook (GLS) that I used when I last taught this course. This week we finally have to choose which direction to go next: (i) an early exposure to interpreting the derivative, in word problems, tables, and graphs; (ii) further expansion of our library of functions with differentiation formulas for exp and log; or (iii) more symbolic manipulation via the product, quotient, and chain rules. Approach (i) is favored by GLS and agrees with Sanjay Rai's admonition to get students to think about the meaning of the mathematical objects they encounter. Approach (iii) is more compatible with the BZB text, but might follow too quickly on the heels of the basic differentiation techniques that some students still need more practice to master.
Approach (ii) is traditionally considered more appropriate for STEM majors, and would not even be a contender were it not for a curious legacy from Mary Kay Abbey's tenure as a math professor in our department. Thanks to Professor Abbey, even our intermediate algebra for liberal arts course (MATH 093, formerly MA 097) was structured to give students early exposure to transcendental functions. This innovation was intended to allow cumulative review throughout the semester of what might otherwise have received only a two- or three-week treatment late in the semester when students are unable to process new material as quickly. Although none of my calculus students this semester are coming directly from an intermediate algebra course so structured (seeing as MATH 093 was retired from the catalog last year), the rationale for early transcendentals is just as valid in the calculus context as it was in intermediate algebra, and perhaps even more so when business majors make up a significant proportion of the class. The Harvard calculus text (Hughes-Hallett et al.) makes a similar appeal for early transcendentals, so I feel well-backed by precedent in choosing this direction for week 5.
More radical still would be the choice to follow the Harvard calculus text even in the early introduction of the definite integral. While the historical development of calculus supports this parallel introduction of the two key problems (slope of a curve at a point, and finding areas by exhaustion), the BZB homework exercises, if assigned in this order, might produce far too much confusion. Still, if my next chance to teach elementary applied calculus is several years away, I can't pass up the opportunity to collect data on how well this approach fares with the latest cohort of students.
Accordingly, on Monday I gave a rough overview of the area problem and the construction of Riemann sums. Then I sketched out a scaling argument to show why the function that accumulates area under the unit hyperbola xy=1 satisfies the same key properties as logarithm functions. We worked through a few review exercises on algebraic properties of logarithms before moving on to the differentiation formulas.
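For anyone curious, here is a rough numerical sketch of that scaling claim (my own illustration in Python, not something we ran in class; the helper name is made up): midpoint Riemann sums for the area under y = 1/x from 1 to b suggest that the accumulated area satisfies A(ab) ≈ A(a) + A(b), the additive property that characterizes logarithms.

```python
# Midpoint Riemann sums for the area under y = 1/x from 1 to b.
# Checking numerically that A(a*b) is close to A(a) + A(b), the
# defining property of a logarithm.

def area_under_hyperbola(b, n=100_000):
    """Approximate the integral of 1/x on [1, b] with n midpoint rectangles."""
    width = (b - 1) / n
    total = 0.0
    for i in range(n):
        midpoint = 1 + (i + 0.5) * width
        total += width / midpoint
    return total

a, b = 2.0, 3.0
print(area_under_hyperbola(a * b))                        # ~1.7918
print(area_under_hyperbola(a) + area_under_hyperbola(b))  # ~1.7918 as well
```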
Today we rewrote on the board the basic differentiation formulas for powers, exponential, and logarithmic functions, and then set ourselves the long-run task of differentiating a composition. Along the way, we would derive the connection between logarithms, relative rates of change, and the product rule. Borrowing from Stephen Maurer's MAA article on hat derivatives, a motivating question to frame this investigation went as follows: "In country X, the population is growing at 2% per year, and the per capita beef consumption is growing at 3% per year. How fast is the total beef consumption changing?"
Most students seemed unfazed by the use of local linearity to expand C(t+1) and P(t+1), the per capita consumption and the population, respectively, one year from now. I had hoped that the use of the distributive law would be similarly trivial, but a couple questions on that manipulation did arise. It's encouraging that this group had enough self-advocacy skills to not suppress these questions out of fear of embarrassment in front of their peers.
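For the record, here is a rough sketch (my own Python reconstruction of the board work, with made-up variable names) of how that expansion resolves the beef question: writing the total consumption as B(t) = P(t)·C(t), local linearity gives P(t+1) ≈ 1.02·P(t) and C(t+1) ≈ 1.03·C(t), and distributing shows why the relative growth rates very nearly add.

```python
# One year of growth for total beef consumption B = P * C,
# with P growing 2% per year and C growing 3% per year.

p_rate, c_rate = 0.02, 0.03

# Distributing (1 + 0.02)(1 + 0.03):
growth_factor = (1 + p_rate) * (1 + c_rate)
print(growth_factor - 1)   # 0.0506: about 5%, since the cross term 0.02*0.03 is tiny

# The same conclusion from the product rule in relative-rate form: B'/B = P'/P + C'/C
print(p_rate + c_rate)     # 0.05
```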
The use of MATLAB animations of composite functions, to explain the chain rule in terms of multiplying the slopes of two different curves, did not go over too well. Too many students seem to lack the skill of reading the axes of a graph carefully, as I'll no doubt learn again when I ask them to interpret the graph of f' to locate the relative extrema of f. Adapting the context-free MATLAB animations on Youtube into hand-drawn graphs of population versus time and ln(P) versus P, set in the context of national beef consumption with a relative population growth rate of 2%, seemed to make the connection more intelligible.
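Since those animations live on Youtube rather than in this post, here is a rough stand-in (a Python sketch of my own, not the classroom MATLAB code) for the multiplication of slopes: for a population growing at a continuous 2% per year, the slope of ln(P) versus t comes out as the slope of ln(P) versus P times the slope of P versus t.

```python
import math

# Population growing at a continuous 2% per year, so P(t) = P0 * exp(0.02 t).
P0, r, t, h = 1000.0, 0.02, 5.0, 1e-6

def P(t):
    return P0 * math.exp(r * t)

# Numerical slopes of the two curves being composed:
dP_dt = (P(t + h) - P(t)) / h                        # slope of P versus t
dlnP_dP = (math.log(P(t) + h) - math.log(P(t))) / h  # slope of ln(P) versus P

# Chain rule: the slope of ln(P) versus t is the product of the two slopes,
# and it recovers the relative growth rate r = 0.02.
print(dlnP_dP * dP_dt)   # ~0.02
print(r)                 # 0.02
```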
I have to be careful to distinguish between a narrative that is everywhere locally intelligible and one that is globally intelligible. While each individual step in this development towards the chain rule and product rule might make sense on its own, the sheer number of new ideas presented this week is a significant obstacle to seeing the big picture. In subsequent weeks I'll probably have to field a number of questions on the homework assignments, since our limited class time left little room to demonstrate examples of every technique.