
TensorFlow: A primer

A simple introduction to TensorFlow

Morgan · Published in metaflow-ai · 3 min read · Nov 15, 2016

TensorFlow (TF) does not work like a typical program. You are probably used to writing mathematical operations like a = 2 + 2, where a equals 4, right?

Well, TF blurs the line between mathematical operations and their actual results, and you are going to end up with a equal to a … Tensor. WTF?

Simple add operation in TensorFlow
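Here is a minimal sketch of that add operation. It is written against the TF 1.x graph API (accessed through `tf.compat.v1` so it also runs on TF 2.x installs; under TF 1.0 the first two lines are simply `import tensorflow as tf`):

```python
# Under TF 1.0 this is simply `import tensorflow as tf`; the compat.v1
# alias lets the same graph-mode code run on a TF 2.x install.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Graph level: this builds an "add" node, it does NOT compute 4.
a = tf.add(2, 2)
print(a)  # something like: Tensor("Add:0", shape=(), dtype=int32)

# Session level: only now does the operation actually execute.
with tf.Session() as sess:
    result = sess.run(a)
    print(result)  # 4
```

Note that printing `a` outside the Session shows you a Tensor object describing the operation, not the number 4.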

When you write math in TF, you have to think like an architect: you are designing operations, not calculating things. The computation happens in the next phase: everything that “happens” in TF happens within a Session. So when you “add” something in TF, you are designing an “add” operation, not actually adding anything.

All these operations are organised into a Graph; your Graph holds operations and tensors, not values.

When you start what is called a Session, you create a new scope for your program where operations can “happen”. This is where you can run or evaluate operations and tensors, and when you do so, results start to unfold: tensors get filled with real values, operations get computed, results are obtained, functions converge, etc.

But as soon as you get out of the scope of your Session, the Graph returns to its static and quite boring state; we are back to theory. To sum up, TF code has two main phases:

  • The Graph level: You can design mathematical and control flow operations which will be the different parts of your Graph. At this level, you can only save your Graph itself and its metadata, nothing tangible exists yet.
  • The Session and evaluation level: variables get initialised, other bookkeeping functions get configured, operations get executed, intermediary tensors and gradients get calculated, etc.
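The two phases above can be made concrete with a short sketch (TF 1.x graph API via `tf.compat.v1`; the node names `x`, `y` and `total` are just illustrative choices):

```python
import tensorflow.compat.v1 as tf  # simply `import tensorflow as tf` on TF 1.0
tf.disable_eager_execution()

# Graph level: design the operations; nothing tangible exists yet.
x = tf.constant(3, name="x")
y = tf.constant(4, name="y")
total = tf.add(x, y, name="total")

# The Graph itself (structure and metadata) can already be inspected or saved.
graph_def = tf.get_default_graph().as_graph_def()
node_names = [node.name for node in graph_def.node]
print(node_names)  # includes 'x', 'y' and 'total'

# Session level: operations get executed and real values appear.
with tf.Session() as sess:
    total_value = sess.run(total)
    print(total_value)  # 7
```

Notice that the graph definition only lists operation names and their connections; the value 7 only exists once the Session runs.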

The most important point is that only variables keep their data between multiple session evaluations. All other tensors are temporary, which means they will be destroyed and inaccessible inside your training for-loop unless you feed them again on each step with a proper feed_dict or any other input pipeline of your choice.

See for yourself:

Only variables keep their data between multiple evaluations
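A minimal sketch of that behaviour, again against the TF 1.x graph API via `tf.compat.v1` (the names `counter` and `x` are illustrative):

```python
import tensorflow.compat.v1 as tf  # simply `import tensorflow as tf` on TF 1.0
tf.disable_eager_execution()

# A Variable keeps its value between session.run calls...
counter = tf.Variable(0, name="counter")
increment = tf.assign(counter, counter + 1)

# ...while a plain tensor must be re-fed (or recomputed) on every run.
x = tf.placeholder(tf.int32, name="x")
doubled = x * 2

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    counts = [sess.run(increment) for _ in range(3)]
    print(counts)  # [1, 2, 3]: the Variable accumulates across runs
    doubled_value = sess.run(doubled, feed_dict={x: 5})
    print(doubled_value)  # 10: x only has a value for this single run
```

The counter remembers its state from run to run, while the placeholder tensor has to be fed fresh data on every evaluation.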

TensorFlow best practice series

This article is part of a more complete series of articles about TensorFlow. I have not yet settled on all the subjects of this series, so if you want to see any area of TensorFlow explored, add a comment! So far, these are the subjects I want to explore (the list is subject to change and is in no particular order):

Note: TF is evolving fast right now; these articles are currently written for the 1.0.0 version.
