
How to pronounce "ReLU" (Rectified Linear Unit)?

English Language & Usage Asked by Jonathan Dayton on March 12, 2021

A Rectified Linear Unit is a common activation function in deep neural networks and is often abbreviated as “ReLU”. I usually pronounce it as /rel-you/ (with the “e” as in “relative” or “rectified”), but I often hear colleagues pronounce it /ray-loo/.
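For context, the function being named is very simple; a minimal sketch in Python (the function name and example values are my own, not part of the question):

```python
def relu(x):
    # Rectified Linear Unit: passes positive inputs through unchanged,
    # and maps negative inputs to zero.
    return max(0.0, x)

print(relu(3.5))   # 3.5
print(relu(-2.0))  # 0.0
```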

Is there a consensus on the correct pronunciation of “ReLU”?

One Answer

I've always heard professors in classes I've taken at Stanford, such as CS230 and CS231N, pronounce it as /ray-loo/ and /reh-loo/, and I've always used /ray-loo/.

However, the lecturer in this video uses the /rel-you/ pronunciation, which prompted the search that brought me here. The lecturer clearly has extensive ML experience.

Based on my anecdotal experience, I'd guess that /ray-loo/ is the most common, but all three pronunciations seem to be accepted, given how involved all of these speakers are in the ML community.

Answered by Josh Wolff on March 12, 2021

