# TensorFlow: A primer

## A simple introduction to TensorFlow

TensorFlow (TF) does not work like a typical program. You are probably used to writing mathematical operations like `a = 2 + 2`, where `a` is equal to 4, right?

Well, TF blurs the line between mathematical operations and their actual results: you end up with `a` equal to a … `Tensor`. WTF?
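To make this concrete, here is a minimal sketch using the TF 1.x API the article targets. Calling `tf.add` gives you back a `Tensor` object describing the operation, not the number 4:

```python
import tensorflow as tf

# This does NOT compute 2 + 2. It adds an "Add" node to the default Graph.
a = tf.add(2, 2)

print(a)  # a Tensor object (with a name, shape and dtype), not the value 4
```

Printing `a` shows metadata about the operation; the actual value only exists once you evaluate it inside a Session, as explained below.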

When you write math in TF, you have to think like an architect. You are **designing** operations, **not** **calculating** things. Computation happens in the next phase: everything that “happens” in TF happens within a **Session**. So when you “add” something in TF, you are designing an “add” operation, not actually adding anything.

All those operations are organised as a **Graph**: your **Graph** holds **operations and tensors**, not values.

> When you “add” something in TF, you are designing an “add” operation, not actually adding anything

When you start what is called a **Session**, you create a new scope for your program where operations can actually “happen”. This is where you can **run or evaluate** operations and tensors. And when you do so, results start to unfold: tensors get filled with real values, operations get computed, results are obtained, functions converge, etc.

But as soon as you get out of the scope of your **Session**, the **Graph** returns to its static and quite boring state; we are back to theory. To sum up, TF code has two main phases:

- The **Graph** level: you design mathematical operations, which are the different bricks of your **Graph**. At this level, you can **only save** your **Graph** itself and its metadata; nothing tangible exists yet.
- The **Session** and **evaluation** level: variables get initialised, other bookkeeping functions get configured, operations get executed, intermediary tensors and gradients get calculated, etc.

Now a little tip, and the most important part: only variables keep their data between multiple evaluations. All other tensors are temporary, which means they will be destroyed and **inaccessible in your training for-loop without a proper feed_dict**.

> Only variables keep their data between multiple evaluations

### TensorFlow best practice series

This article is part of a more complete series of articles about TensorFlow. I haven’t yet defined all the different subjects of this series, so if you want to see any area of TensorFlow explored, add a comment! So far, these are the subjects I want to explore (this list is subject to change and is in no particular order):

- A primer (this one :) )
- How to handle shapes in TensorFlow
- TensorFlow saving/restoring and mixing multiple models
- How to freeze a model and serve it with a python API
- TensorFlow: A proposal of good practices for files, folders and models architecture
- TensorFlow howto: a universal approximator inside a neural net
- How to optimise your input pipeline with queues and multi-threading
- Mutating variables and control flow
- How to handle preprocessing with TensorFlow.
- How to control the gradients to create custom back-prop or fine-tune my models.
- How to monitor my training and inspect my models to gain insight about them.

Note: TF is evolving fast right now; these articles are currently written for the 1.0.0 version.