
Let’s make Machine Learning easy with scikit-learn on LabVIEW


Recommended Posts


Dear Community,

 

The HAIBAL project is structured in the same way as Keras.

The project consists of more than 3,000 VIs, all coded in native LabVIEW (for readers coming from Python, a rough Keras equivalent is sketched after the list below): 😱😱😱

  • 16 activations (ELU, Exponential, GELU, HardSigmoid, LeakyReLU, Linear, PReLU, ReLU, SELU, Sigmoid, SoftMax, SoftPlus, SoftSign, Swish, TanH, ThresholdedReLU), nonlinear mathematical functions generally placed after each layer that has weights.
  • 84 layer types (Dense, Conv, MaxPool, RNN, Dropout, etc.).
  • 14 loss functions (BinaryCrossentropy, BinaryCrossentropyWithLogits, Crossentropy, CrossentropyWithLogits, Hinge, Huber, KLDivergence, LogCosH, MeanAbsoluteError, MeanAbsolutePercentage, MeanSquare, MeanSquareLog, Poisson, SquaredHinge), functions evaluating the prediction against the target.
  • 15 initialization functions (Constant, GlorotNormal, GlorotUniform, HeNormal, HeUniform, Identity, LeCunNormal, LeCunUniform, Ones, Orthogonal, RandomNormal, RandomUniform, TruncatedNormal, VarianceScaling, Zeros), functions initializing the weights.
  • 7 optimizers (Adagrad, Adam, Inertia, Nadam, Nesterov, RMSProp, SGD), functions updating the weights.
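
For those who know the Python side, here is a minimal Keras sketch showing how those same five kinds of building blocks fit together. It is illustrative only: the layer sizes and names are our own choice for the example, not taken from HAIBAL, which organises the equivalent operations as native LabVIEW VIs.

    # Rough Keras (Python) equivalent of the building blocks listed above.
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(
            64,
            activation="relu",                     # an activation function
            kernel_initializer="glorot_uniform",   # an initialization function
            input_shape=(10,),
        ),                                         # a layer with weights
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

    model.compile(
        optimizer=tf.keras.optimizers.Adam(),      # an optimizer updating the weights
        loss=tf.keras.losses.BinaryCrossentropy(), # a loss comparing prediction and target
    )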

 

 

Currently, we are working on full Keras compatibility via the HDF5 file format, and we will soon start the same work for PyTorch. (We can already load models from it and will be able to save models to it in the future; this part is important for us.)

Obviously, CUDA already works if you use an NVIDIA board, and NI FPGA boards will also be supported, but that part is not done yet.

We are also working on full integration with all Xilinx Alveo systems for acceleration.

Users will be able to build any model they want; the only limitation will be their hardware (we will offer the same freedom as Keras or PyTorch). In the future, our company could also offer hardware, for example a Linux server with a Xilinx Alveo card (https://www.xilinx.com/products/boards-and-kits/alveo.html), fully compatible with HAIBAL!

 

About the project communication: 

The website will be completely redone, a YouTube channel will be set up with many tutorials, and a set of well-known examples will be offered within the library (YOLO, MNIST, etc.).

For now, we haven't set a release date, but we are thinking of next July (it's not official; we are a small, passionate team of three, and we are doing our best to finish the product and release it soon).

 

This work is titanic, and believe me, it makes us happy that you encourage us (it boosts us). In short, we are doing our best to release this library as soon as possible.

Just a little more patience…

 

YouTube video:

 

This example is a template state machine using the HAIBAL library.

It shows a signal (here a cosine), and during its training the neural network has to learn to predict this signal (here we chose 5 dense layers with 40 neurons per layer).

This template will be offered as a basic example to show how to initialize, train and use a neural network model. This kind of "visualisation example" is inspired by https://playground.tensorflow.org/ and is meant to help those who want to start learning deep learning.
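
As a point of comparison only, here is a minimal Python/Keras sketch of the same kind of experiment (the layer count and width follow the description above; the HAIBAL version builds this graphically from VIs and plots the prediction against the signal during training):

    # A small dense network learning to predict a cosine signal,
    # mirroring the demo described above.
    import numpy as np
    import tensorflow as tf

    x = np.linspace(-np.pi, np.pi, 1000).reshape(-1, 1)
    y = np.cos(x)

    model = tf.keras.Sequential(
        [tf.keras.layers.Dense(40, activation="tanh") for _ in range(5)]
        + [tf.keras.layers.Dense(1)]            # linear output for regression
    )
    model.compile(optimizer="adam", loss="mse")
    model.fit(x, y, epochs=200, verbose=0)      # the demo compares model(x) with cos(x)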

 


Edited by Youssef Menjour
23 minutes ago, ShaunR said:

I don't have a use case for this, but I love that the API uses polymorphic VIs; a vastly underused feature, IMO.

If you want to organise the functions instead of having a huge linear list, you can group them by separating the menu item with a colon ":".

As described here, I just discovered: https://zone.ni.com/reference/en-XX/help/371361R-01/lvhowto/editing_the_shortcut_menu/


Thank you very much for your encouragement. 😀
Yes, we can confirm that it took a lot of work, and your encouragement pushes us to do more!

We also thank you for your suggestions for improvement (we are interested!! The objective of HAIBAL is to be a user-friendly library).

I (Youssef Menjour) have always liked LabVIEW and artificial intelligence and it was frustrating not to have an efficient tool at our disposal.

We will start sharing more and more examples in the next few days.🧑‍🎓
We will also soon offer a free library to pilot a drone that is easily affordable on Amazon, because HAIBAL will include an example of an AI-assisted autopilot for this drone (and a complete tutorial on YouTube).

We are also thinking about doing the same with a "mini cheetah" type robot.

In short, things will move in the next few weeks; we still have a lot of work, and once again your encouragement makes us really happy.

LabVIEW without AI is a thing of the past. 💪

 

 

 

This example is a template state machine using the HAIBAL library.

It shows a signal (here a sinc), and during its training the neural network has to learn to predict this signal (here we chose 10 dense layers with 50 neurons per layer).

This template will be offered as a basic example to show how to initialize, train and use a neural network model. This kind of "visualisation example" is inspired by https://playground.tensorflow.org/ and is meant to help those who want to start learning deep learning.

 

Edited by TDF
  • 2 weeks later...

The HAIBAL toolkit will let users build their own architectures, training and predictions natively in LabVIEW.
Of course, we will natively provide numerous examples such as YOLO, MNIST, VGG... (which users can directly use and modify).

As our toolkit is new, we chose to also be fully compatible with Keras. This means that if you already have a model trained in Keras, it will be possible to import it into HAIBAL. This also opens our library to the thousands of models available on the internet (every import is translated into native HAIBAL LabVIEW code that users can edit).
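
For reference, producing such a file on the Keras side is a one-liner. This is only a sketch of the Keras workflow (the model and file name are made up for the example), not HAIBAL's own API:

    # Save a Keras model in HDF5 format, the format HAIBAL imports.
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(10, activation="relu", input_shape=(4,)),
        tf.keras.layers.Dense(3, activation="softmax"),
    ])
    model.save("my_model.h5")   # single HDF5 file holding architecture + weights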



In this case, you will have two choices:
1 - use it in LabVIEW (predict / train);

2 - generate the equivalent native HAIBAL architecture (as you can see in the video) in order to modify it as you wish.

HAIBAL is more than 3,000 VIs; it represents a huge amount of work, and we are not finished yet. We hope to release the first version this summer (with CUDA) and to add NI FPGA optimisation to speed up inference (OpenCL and full Xilinx FPGA compatibility will also come during 2022/2023).

We are currently building our website and our YouTube channel.


The team will publish tutorials (YouTube/GitHub) and news (website) to give users visibility.

 

In this video we import a Keras VGG-16 model saved in HDF5 format into the HAIBAL LabVIEW deep learning library. Then, with our scripting, we can generate the graph so that users can modify the architecture for their own purposes before running it.
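
On the Keras side, producing and inspecting the kind of HDF5 file shown in the video could look like the sketch below (the file name is chosen for the example; the group layout is the standard Keras HDF5 format, which is what any importer has to read):

    # Export the pretrained Keras VGG-16 and peek at the per-layer weight groups.
    import h5py
    import tensorflow as tf

    tf.keras.applications.VGG16(weights="imagenet").save("vgg16.h5")

    with h5py.File("vgg16.h5", "r") as f:
        print(list(f["model_weights"].keys()))   # block1_conv1, ..., fc1, fc2, predictions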

Edited by Youssef Menjour
Adding more information
  • 4 months later...

Hello everyone,
 

 

 

 

It is now time for us to communicate about the project. First of all, thank you for the support that some of you have given us during this summer. We didn't go on vacation and continued to work on the project.

We didn't communicate much because we were so busy. 

HAIBAL will be released with some delay, but it will be released soon.

We have solved many of our problems and are actively continuing development. Release 1 should be coming soon, and we are thinking of setting up a free beta version so the community can give us feedback on the product. (What improvements would you like to see?)

For the official release, we might be a little late because the graphics side is not far along yet. We still have to build a website and a YouTube channel for the HAIBAL project, not to mention the design of the icons, which has not been started yet.

In short, our designer has a lot of work ahead.

In the meantime, here is the HAIBAL promotional video!

See you soon, community! Be patient, the revolution is coming.


Edited by Youssef Menjour
14 hours ago, ShaunR said:

Is that some marketing guy's idea of what it would probably look like to use, or do you actually have that interface?


This video is meant to popularise our product. We took inspiration from NI:


It's marketing and popularisation, because we are addressing the deep learning world, which generally does not know the graphical language.

3 hours ago, Youssef Menjour said:

This video is meant to popularise our product. We took inspiration from NI:
It's marketing and popularisation, because we are addressing the deep learning world, which generally does not know the graphical language.

Then you might want to take a look at Express VIs. There is an Express palette where many of the fundamental acquisition and signal analysis use cases are available, so you can see how they operate.

You would be able to create VIs that operate the way your marketing guys portray. The user could then configure each node in a pop-up design-time dialogue. Express VIs are for novice users and those with minimal programming expertise, who you seem to be targeting.

On 8/20/2022 at 4:22 PM, Antoine Chalons said:

Do you mean you want to attract Python people so they can use ML in LabVIEW?

The idea is to offer not only a simple library for LabVIEW but also an alternative to Keras and PyTorch.

HAIBAL is a powerful, modular library and more practical than Python. Being able to run any deep learning graph on CUDA or a Xilinx FPGA platform with no special syntax is really good for AI users.

We are convinced that doing artificial intelligence with LabVIEW is the way of the future. The graphical language is perfect for this.

The promotional video is not aimed at LabVIEW users. It is for those who want to do deep learning in a graphical language. We are addressing engineering students, professors, researchers and the R&D departments of companies.

LabVIEW users already know what drag-and-drop and dataflow are. LabVIEW users do not need this kind of video.

 

 

8 hours ago, Youssef Menjour said:

The idea is to offer not only a simple library for LabVIEW but also an alternative to Keras and PyTorch.

HAIBAL is a powerful, modular library and more practical than Python. Being able to run any deep learning graph on CUDA or a Xilinx FPGA platform with no special syntax is really good for AI users.

We are convinced that doing artificial intelligence with LabVIEW is the way of the future. The graphical language is perfect for this.

The promotional video is not aimed at LabVIEW users. It is for those who want to do deep learning in a graphical language. We are addressing engineering students, professors, researchers and the R&D departments of companies.

LabVIEW users already know what drag-and-drop and dataflow are. LabVIEW users do not need this kind of video.

 

 

What you have made looks great, and I absolutely commend your effort.

But I think the shaded area of the Venn diagram of people who are willing to pay for LabVIEW and who want to get into deep learning using a new, paid-for toolkit is a number approaching machine epsilon. NI has abandoned academic institutions and is introducing a new licensing model that is predatory and harmful to the long-term viability of LabVIEW. It breaks my heart...

Edited by Neil Pate

You have two very similar threads on this forum, which makes it difficult to have a linear discussion: https://lavag.org/topic/22489-let%E2%80%99s-make-deep-learning-easy-with-haibal-library-on-labview/#comment-142891

We all agree that graphical programming has tons of advantages over pages of text code.

However, most of us hate being arm-twisted into paying annually for new versions of a language that fixes bugs at a snail's pace (if ever), releases features that take 2 to 3 versions to work as advertised, and offers an abysmal level of technical support (and the list of recriminations could go on and on).

Your toolkit looks truly amazing, but I would argue that you are not alone in having invested a massive amount of time and effort in G code. Yet some of us who have, have come to the conclusion that we cannot reasonably tie our work to NI's whims and suicidal plans for LabVIEW. I have decided to migrate to Python as fast as I can.

I'd argue that someone who is willing to do so (migrate to Python) will find the resources to master the basics of DL in Python, not to mention that you'll have to keep running to stay on top of new developments in the field with your toolkit. How you can hope to do that for free is a mystery to me. As I said, admirable, but a tad idealistic... Sorry for being the bearer of bad news, but "un homme averti en vaut deux" (forewarned is forearmed), as the French say.

