DS from S: Chapter 19
Deep Learning
Thought before starting
Learned a lot last chapter, looking forward to this one!
Thoughts while reading
What I learned
Apparently “deep learning” used to refer just to “deep” neural networks (i.e., more than one hidden layer) but now encompasses a lot of architectures
What I liked
Oh, it’s been a while since I’ve worked with tensors!
Recursive programming is always fun. The author’s `tensor_sum()` is what I’m thinking of here.
Not data science, but “zeros” is the plural of “zero” in the US, while “zeroes” is the plural in the UK, apparently.
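The book treats a tensor as nested Python lists, so a recursive sum falls out naturally. A minimal sketch of what `tensor_sum()` might look like (the `Tensor` alias and the exact structure are my guesses, not necessarily the author’s code):

```python
from typing import List, Union

# Assumption: a Tensor is either a bare number or a list of Tensors.
Tensor = Union[float, List["Tensor"]]

def tensor_sum(tensor: Tensor) -> float:
    """Recursively sum every number in a nested-list tensor."""
    if isinstance(tensor, list):
        # Recursive case: sum the totals of each sub-tensor.
        return sum(tensor_sum(t) for t in tensor)
    # Base case: a bare number contributes itself.
    return tensor
```

The nice part is that the same function handles a scalar, a vector, a matrix, or anything deeper without knowing the nesting depth in advance.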
What I disliked
Apparently “tensor” here just means a multidimensional array of numbers, not a tensor field.
Really wish the author would explain his names. The `is_1d()` function threw me off at first because I thought the “1” was an “l”.
This has come up in previous chapters, but I don’t love that the author uses `input` (a Python built-in) as a variable name.
The author mentions in this chapter that the sigmoid function has gone out of style. Not sure why he didn’t start us off with tanh (or something similar).
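For my own reference, the two activations side by side. Sigmoid squashes to (0, 1) while tanh squashes to (-1, 1) and is zero-centered, which is one common reason it’s preferred; the `tanh_activation` name is mine, not the book’s:

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: maps x to (0, 1); saturates for large |x|."""
    return 1 / (1 + math.exp(-x))

def tanh_activation(x: float) -> float:
    """tanh: maps x to (-1, 1); zero-centered output, which tends to
    keep gradients better behaved than sigmoid's."""
    return math.tanh(x)

# The two are closely related: tanh(x) == 2 * sigmoid(2 * x) - 1.
```

That last identity means tanh is literally a shifted, rescaled sigmoid, so the choice is about output range rather than expressive power.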
Other thoughts
I wonder when I’ll learn how many layers I’ll actually want in my multi-layer neural networks

