Part 8 - Interfaces and Memory

Posted on Feb 19, 2023
(Last updated: May 26, 2024)

In this part we’ll cover interfaces, interconnects, and memory.

Interface timing

In digital circuits, we often want to send data from a sender to a receiver.

How can we achieve this data passing from one module to another?

The answer is:

  • Open loop
  • Flow Control
  • Serialized

In an open loop, there is no feedback from the receiver: either it is always “valid” to send data, or data is sent periodically.

In flow control, as the name suggests, the data flow is controlled. The sender needs to output a “valid” signal, and the receiver needs to send a “ready” signal.

Depending on which side drives the transfer, we have either push flow or pull flow:

  • Push flow
    • Assume receiver always ready
  • Pull flow
    • Assume sender always has valid data

When both signals are asserted in the same clock cycle, the data is transferred.
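To make the handshake concrete, here is a minimal cycle-by-cycle sketch in Python (purely illustrative - the `simulate` function and the signal names are made up for this example and not tied to any particular HDL or bus standard):

```python
# Minimal sketch of a valid/ready handshake: a word is transferred only on
# cycles where both valid and ready are high.

def simulate(sender_data, receiver_ready):
    """Transfer one word per cycle in which valid and ready are both high."""
    received = []
    idx = 0  # next word the sender wants to send
    for cycle, ready in enumerate(receiver_ready):
        valid = idx < len(sender_data)   # sender asserts valid while it still has data
        if valid and ready:              # handshake: both high in the same cycle
            received.append(sender_data[idx])
            idx += 1
        print(f"cycle {cycle}: valid={int(valid)} ready={ready}"
              f"{' -> transfer' if valid and ready else ''}")
    return received

# The receiver is only ready on some cycles; no data is lost or duplicated.
print(simulate(sender_data=[0xA, 0xB, 0xC], receiver_ready=[0, 1, 1, 0, 1]))
```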

Serialization is the idea that we split up the data into smaller chunks. When the frame signal is active, it means a serial frame is about to start - in layman’s terms, the beginning of a new data packet.

Flow control can be applied at either the frame level or the word level.
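As a rough illustration of serialization with a frame signal, here is a small Python sketch; the word width and the `serialize` helper are assumptions made for this example, not part of any standard:

```python
# Split a wide value into word-sized chunks; the frame signal is asserted
# only on the first word of the packet.

def serialize(value, total_bits, word_bits=8):
    """Yield (frame, word) pairs, least-significant word first."""
    n_words = (total_bits + word_bits - 1) // word_bits
    for i in range(n_words):
        word = (value >> (i * word_bits)) & ((1 << word_bits) - 1)
        yield (i == 0, word)   # frame marks the start of a new frame

# A 32-bit value sent as four 8-bit words; frame=1 marks the first word.
for frame, word in serialize(0xDEADBEEF, total_bits=32):
    print(f"frame={int(frame)} word=0x{word:02X}")
```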

Interconnects

In circuits, there is also a need for clients to communicate with each other - how do we achieve this?

The common solution is a so-called bus. The bus is a shared resource over which clients can communicate with each other.

But there are other solutions:

  • Crossbar switch
  • Interconnection networks

There are a lot of different factors that impact what solution you pick:

  • Cost
  • Scalability
  • # of connections

These are just a few.
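As a toy illustration of the bus as a shared resource, here is a fixed-priority arbiter sketch in Python; real buses also define addressing, data transfer, and timing rules, and the `arbitrate` helper is just a name invented for this example:

```python
# A shared bus can only serve one client per cycle, so an arbiter decides
# who gets it. This uses the simplest possible policy: fixed priority.

def arbitrate(requests):
    """Grant the bus to the lowest-numbered requesting client (fixed priority)."""
    for client, req in enumerate(requests):
        if req:
            return client
    return None  # bus idle

# Three clients request the bus over four cycles; at most one grant per cycle.
for cycle, reqs in enumerate([[0, 1, 1], [1, 1, 0], [0, 0, 0], [0, 0, 1]]):
    print(f"cycle {cycle}: requests={reqs} -> granted client {arbitrate(reqs)}")
```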

Memory

We’ve already seen and worked with memory, so let’s refresh what we’ve covered:

Uses:

  • Data & program storage
  • General purpose registers
  • Buffering
  • Lookup tables
  • Combinational Logic implementation
  • Whenever a large collection of state elements is required.

Types:

  • RAM - random access memory
  • ROM - read only memory
  • EPROM, FLASH - electrically programmable read only memory

What we usually mean by memory, though, is many addressable, fixed-size locations.

$n$ bits allow the addressing of $2^n$ memory locations. Example: 24 bits can address $2^{24} = 16{,}777{,}216$ locations.

If each location holds 1 byte (= 8 bits) then the memory is 16 MB. If each location holds one word (32 bits = 4 bytes) then it is 64 MB.
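A quick sanity check of that arithmetic (here 1 MB = $2^{20}$ bytes):

```python
# Addressable locations and total memory size for a 24-bit address.

def memory_size_bytes(address_bits, bytes_per_location):
    return (2 ** address_bits) * bytes_per_location

print(2 ** 24)                               # 16777216 addressable locations
print(memory_size_bytes(24, 1) / 2 ** 20)    # 16.0 MB with byte-wide locations
print(memory_size_bytes(24, 4) / 2 ** 20)    # 64.0 MB with 32-bit (4-byte) words
```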

Computers are either byte or word addressable, meaning that each memory location holds either 8 bits (1 byte), or a full standard word for that computer architecture.

  • Each bit
    • Is a gated D-latch
  • Each location
    • Consists of $w$ bits, either $w = 8$ or $w =$ max width
  • Addressing
    • $n$ locations means $\log_2(n)$ address bits
    • Decoder circuit translates the address into 1 of $n$ locations (see the decoder sketch after this list)
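Here is a tiny sketch of that decoder idea - an $n$-bit address selects exactly one of $2^n$ locations (the `decode` helper is just an illustrative name):

```python
# One-hot address decoder: exactly one word line is active per address.

def decode(address, address_bits):
    """Return a one-hot list with a 1 only at the addressed location."""
    n_locations = 2 ** address_bits
    return [1 if i == address else 0 for i in range(n_locations)]

print(decode(address=5, address_bits=3))  # [0, 0, 0, 0, 0, 1, 0, 0]
```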

Now, let’s define some words and terms that we encounter often while working with circuits:

  • Bandwidth:
    • Total amount of data transferred across a device or interface per unit time (usually bytes/sec)
  • Latency:
    • A measure of the time from a request for a data transfer until the data is received.
  • Memory Interfaces for Accessing Data
    • Asynchronous (unclocked):
      • A change in the address results in data appearing
    • Synchronous (clocked):
      • A change in address, followed by an edge on CLK, results in data appearing. Sometimes, multiple requests may be outstanding (a small model follows this list).
  • Volatile:
    • Loses its state when the power goes off. (the opposite: non-volatile)

Also, just to list out the volatile vs non-volatile list:

  • Volatile:
    • Random Access Memory (RAM):
      • DRAM “dynamic”
      • SRAM “static”
  • Non-volatile:
    • Read Only Memory (ROM):
      • Mask ROM “mask programmable”
      • EPROM “electrically programmable”
      • EEPROM “erasable electrically programmable”
      • FLASH memory - similar to EEPROM with programmer integrated on chip

Memory blocks can be (and often are) used to implement combinational logic functions - a small LUT-style sketch follows the examples below.

Examples:

  • LUTs in FPGAs
  • 1Mbit x 8 EPROM can implement 8 independent functions, each of $\log_2(1\text{M}) = 20$ inputs.
  • The decoder part of a memory block can be considered a “minterm generator”.
  • The cell array part of a memory block can be considered an OR function over a subset of rows.
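To tie the last two bullets together, here is a small LUT-style sketch: the address lines act as the function inputs, and the stored bit at each address is the function’s value for that input combination. The 3-input majority function and the helper names are arbitrary choices made for this example:

```python
# Using a memory block as combinational logic: the decoder picks the minterm
# (address), and the stored bit is the function value for that minterm.

def build_lut(func, n_inputs):
    """Precompute the truth table - one stored bit per address (minterm)."""
    return [func(*((addr >> i) & 1 for i in range(n_inputs)))
            for addr in range(2 ** n_inputs)]

majority = lambda a, b, c: int(a + b + c >= 2)
lut = build_lut(majority, 3)

# Evaluating the function is just a memory read at address {c, b, a}.
a, b, c = 1, 0, 1
address = a | (b << 1) | (c << 2)
print(lut[address], majority(a, b, c))   # both print 1
```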