This is a solid and up-to-date book in an area that has seen exponential change. Before we go any further, a disclaimer: I have been a proud owner of the authors’ famous book Computer Architecture: A Quantitative Approach for over 20 years.
This book is 565 pages long, not including the appendices and historical sections available online. It is targeted at people who want to understand basic computer organization, that is, how a system works and performs. This new edition, as the title suggests, uses the RISC-V instruction set in most examples and exercises. Each chapter ends with a series of exercises rated by level of difficulty. Furthermore, each chapter contains a “fallacies and pitfalls” discussion and a historical section, which nicely place the topic just covered in a broader context. Because of its pleasant pace, well-explained text, and clear organization, the book works well both for learning about computer organization and as a reference for practitioners.
Chapter 1 lays the foundation for the chapters that follow by explaining how computer architecture has evolved over the last five decades: what has changed and what has not. It also provides a first peek “under the covers,” as readers are introduced to assembly and machine language and to how a high-level routine in either C or Java is translated into machine language.
Chapter 2 digs deeper into how to manipulate constants and variables, pass parameters to subroutines, and synchronize concurrent processes or threads. It also covers pseudoinstructions and how to address complex data structures.
Chapter 3 dives into the arithmetic operations, for both integers and floating point. The four topics explored in this chapter’s “fallacies and pitfalls” section are particularly interesting for being at once specific and far-reaching.
Chapter 4 takes a closer look at the processor, from implementation principles and components all the way to the details of a pipelined architecture. It also touches on instruction-level parallelism beyond pipelining.
Chapter 5 is all about memory. It covers the memory hierarchy and all the phenomena that follow from it, such as locality, hit and miss rates, addressing, block sizes, and so on. It also discusses virtual memory, and briefly hints at input/output (I/O) as the last level of the hierarchy.
Finally, chapter 6 presents different flavors of parallel processing and the issues and opportunities that arise from breaking a big problem into smaller pieces, from architectures and programming paradigms to loosely and tightly coupled systems, multicores, and clusters.
The preface gives a web address where the online materials should be available (textbooks.elsevier.com/9780128122754). I tried accessing it a few times, in June and July of 2017, but got error 404 (file or directory not found). Searching the publisher’s site (textbooks.elsevier.com) will take users to the correct location for the online materials. They include the historical perspective sections and three appendices, two of which are authored by other contributors. There is also an accompanying simulator for the RISC-V instruction set; a link to riscv.org is included on the Elsevier site. There, readers will find instructions on how to install the simulator on a Linux box. To access the instructor materials, however, one must first register and create an account. I did not.
My one issue with this book is the authors’ decision to break with the long-standing tradition of calling 1,024 bytes (2^10) a kilobyte. In this book, that quantity is now called a kibibyte. Really?! Figure 1.1 (p. 6) lists the new names for KB, MB, GB, all the way to YB. That left me wondering whether a byte still consisted of eight bits. I had to wait until page 69 for confirmation that a byte, at least for now, still contains eight bits.
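For readers unfamiliar with the convention the book adopts, the renaming follows the IEC binary prefixes, which reserve the SI prefixes (kilo, mega, giga) for powers of ten. A minimal sketch of the difference (the variable names are mine, not the book’s):

```python
# Binary (IEC) prefixes, powers of two -- what this book calls kibi-, gibi-, etc.
KIBIBYTE = 2 ** 10   # 1 KiB = 1,024 bytes
GIBIBYTE = 2 ** 30   # 1 GiB = 1,073,741,824 bytes

# Decimal (SI) prefixes, powers of ten -- kilo-, giga- in the strict sense.
KILOBYTE = 10 ** 3   # 1 kB = 1,000 bytes
GIGABYTE = 10 ** 9   # 1 GB = 1,000,000,000 bytes

# The discrepancy grows with each step up the scale: about 2.4% at kilo,
# about 7.4% at giga.
print(KIBIBYTE - KILOBYTE)        # 24 bytes
print(GIBIBYTE - GIGABYTE)        # 73,741,824 bytes
```

The distinction is more than pedantry: it is why a disk marketed as “1 TB” (decimal) reports noticeably less than a tebibyte (binary) to the operating system.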