Advanced Lua Training
Class size (Hotline: 4008699035; Mobile: 15921673576, same number on WeChat)
To increase interaction and guarantee training quality, we insist on small classes: each class is limited to 3 to 5 students. Beyond that limit, students are scheduled into the next session.
Locations and Schedule
Locations: [Shanghai]: Tongji University (West Campus) / Xincheng Jinjun Business Building (Line 11, Baiyin Road Station); [Shenzhen branch]: Film Building (Metro Line 1, Grand Theater Station) / Shenzhen University School of Continuing Education; [Beijing branch]: Beijing Zhongshan College / Fuxin Building; [Nanjing branch]: Jingang Building (Heyan Road); [Wuhan branch]: Jiayuan Building (Gaoxin 2nd Road); [Chengdu branch]: Consulate District No. 1 (Zhonghe Avenue); [Guangzhou branch]: Guangliang Building; [Xi'an branch]: Xietong Building; [Shenyang branch]: Shenyang Ligong University / Liuzhai Zhenpin; [Zhengzhou branch]: Zhengzhou University / Jinhua Building; [Shijiazhuang branch]: Hebei University of Science and Technology / Ruijing Building
Start dates (continuous / evening / weekend classes): please consult our online customer service.
Course Hours
◆ Taught by senior engineers
☆ Emphasis on quality
☆ Lectures combined with hands-on practice
☆ Students who pass the course at a qualified level or above receive free job-referral opportunities
Quality Assurance
☆ 1. If any part of the content is not fully understood or absorbed, it may be retaken free of charge in a later session;
☆ 2. After the course ends, the instructor leaves students a phone number and e-mail address and provides six months of free technical support, to ensure continued consolidation after the training;
☆ 3. Qualified students enjoy free job-referral opportunities.
☆ 4. Qualified students are issued relevant engineer-level certificates free of charge, enhancing your professional credentials.
☆ Course Outline ☆
- The course is divided into three separate days, the third being optional.
- Day 1 Machine Learning & Deep Learning: theoretical concepts
1. Introduction to AI, Machine Learning & Deep Learning
- History, basic concepts and common applications of artificial intelligence, far removed from the fantasies surrounding the field
- Collective Intelligence: aggregating knowledge shared by many virtual agents
- Genetic algorithms: evolving a population of virtual agents through selection
- Classical Machine Learning: definition.
- Types of tasks: supervised learning, unsupervised learning, reinforcement learning
- Types of actions: classification, regression, clustering, density estimation, dimensionality reduction
- Examples of Machine Learning algorithms: Linear regression, Naive Bayes, Random Tree (a short scikit-learn sketch follows this section)
- Machine Learning vs. Deep Learning: problems on which Machine Learning remains the state of the art today (Random Forests & XGBoost)
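
To make these classical algorithms concrete, here is a minimal scikit-learn sketch. The toy dataset and hyperparameters are illustrative only, and logistic regression and a random forest stand in for the linear and tree-based models on a classification task:

```python
# Minimal sketch: classical ML algorithms from this section on a toy dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic binary classification data; sizes are illustrative.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for model in (LogisticRegression(max_iter=1000),
              GaussianNB(),
              RandomForestClassifier(n_estimators=100, random_state=0)):
    model.fit(X_train, y_train)                 # supervised learning step
    print(type(model).__name__, "accuracy:", model.score(X_test, y_test))
```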
2. Basic Concepts of a Neural Network (application: the multilayer perceptron)
- Review of the mathematical foundations.
- Definition of a neural network: classical architecture, activation functions and weighting of previous activations, network depth
- Definition of neural network training: cost functions, backpropagation, stochastic gradient descent, maximum likelihood (a from-scratch sketch follows this section).
- Modeling with a neural network: modeling input and output data according to the type of problem (regression, classification ...). Curse of dimensionality. Distinction between multi-feature data and signal data. Choice of a cost function according to the data.
- Approximating a function with a neural network: presentation and examples
- Approximating a distribution with a neural network: presentation and examples
- Data Augmentation: how to balance a dataset
- Generalization of the results of a neural network.
- Initialization and regularization of a neural network: L1/L2 regularization, Batch Normalization ...
- Optimization and convergence algorithms.
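
The training concepts above can be condensed into a from-scratch example. This is a minimal sketch, assuming a one-hidden-layer perceptron, a squared-error cost and per-sample stochastic gradient descent; the toy data and all sizes are illustrative:

```python
# Minimal sketch: a multilayer perceptron trained by backpropagation and SGD.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (256, 2))                 # inputs
y = (X[:, :1] ** 2 + X[:, 1:]).reshape(-1, 1)    # target function to approximate

W1, b1 = rng.normal(0, 0.5, (2, 16)), np.zeros(16)   # random initialization
W2, b2 = rng.normal(0, 0.5, (16, 1)), np.zeros(1)
lr = 0.1                                         # learning rate

for epoch in range(500):
    for i in rng.permutation(len(X)):            # stochastic: one sample at a time
        x, t = X[i:i+1], y[i:i+1]
        h = np.tanh(x @ W1 + b1)                 # hidden activation
        out = h @ W2 + b2                        # linear output (regression)
        # Backpropagation of the squared-error cost 0.5 * (out - t)^2
        d_out = out - t
        d_h = (d_out @ W2.T) * (1 - h ** 2)      # tanh derivative
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
        W1 -= lr * x.T @ d_h;   b1 -= lr * d_h.sum(0)

print("final MSE:", float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))
```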
3. Standard ML / DL Tools
- For each tool, a brief presentation is planned: advantages, disadvantages, position in the ecosystem, and typical use.
- Data management tools: Apache Spark, Apache Hadoop
- Machine Learning tools: NumPy, SciPy, scikit-learn
- High-level DL frameworks: PyTorch, Keras, Lasagne (see the Keras sketch after this list)
- Low-level DL frameworks: Theano, Torch, Caffe, TensorFlow
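
To illustrate what "high level" means in practice, here is a minimal sketch using the Keras API bundled with TensorFlow 2.x: model definition, compilation and training fit in a few lines, with the training loop handled by the framework. The random data and layer sizes are placeholders:

```python
# Minimal sketch: Keras handles gradients and weight updates behind model.fit().
import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Input(shape=(10,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

X = np.random.rand(128, 10)                     # placeholder data: random noise
y = np.random.randint(0, 2, size=(128, 1))
model.fit(X, y, epochs=3, batch_size=16, verbose=1)
```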
- Day 2 Convolutional and Recurrent Networks
4. Convolutional Neural Networks (CNN).
- Presentation of the CNNs: fundamental principles and applications
- Basic operation of a CNN: convolutional layer, use of a kernel, padding & stride, feature map generation, pooling layers (a Keras sketch follows this section). 1D, 2D and 3D extensions.
- Presentation of the CNN architectures that brought the state of the art in image classification: LeNet, VGG networks, Network in Network, Inception, ResNet. Presentation of the innovations introduced by each architecture and their broader applications (1x1 convolutions, residual connections).
- Use of an attention model.
- Application to a common classification case (text or image)
- CNNs for generation: super-resolution, pixel-to-pixel segmentation. Presentation of the main strategies for upscaling feature maps in image generation.
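
A minimal Keras sketch of the building blocks above (kernel, padding, stride, pooling, feature maps); the input size and layer widths are illustrative only:

```python
# Minimal sketch: CNN building blocks for 2D image classification.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(32, 32, 3)),                    # 2D image input
    layers.Conv2D(16, kernel_size=3, strides=1,         # 3x3 kernel
                  padding="same", activation="relu"),   # 'same' padding keeps 32x32
    layers.MaxPooling2D(pool_size=2),                   # pooling: 16x16 feature maps
    layers.Conv2D(32, kernel_size=3, strides=2,         # stride 2 halves resolution
                  padding="same", activation="relu"),   # 8x8 feature maps
    layers.GlobalAveragePooling2D(),
    layers.Dense(10, activation="softmax"),             # 10-class classification
])
model.summary()                                         # inspect feature map shapes
```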
5. Recurrent Neural Networks (RNN).
- Presentation of RNNs: fundamental principles and applications.
- Basic operation of an RNN: hidden activation, backpropagation through time, unfolded representation (an LSTM sketch follows this section).
- Evolution towards Gated Recurrent Units (GRU) and LSTM (Long Short-Term Memory): presentation of the different gates and states, and the improvements brought by these architectures.
- Convergence and vanishing gradient problems
- Classical architectures: time series prediction, classification ...
- Encoder-Decoder RNN architectures. Use of an attention model.
- NLP applications: word / character encoding, translation.
- Video applications: predicting the next generated frame of a video sequence.
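
A minimal sketch of the time-series prediction case, assuming a Keras LSTM that reads a window of past values and predicts the next one; the synthetic sine-wave data and all sizes are illustrative:

```python
# Minimal sketch: LSTM for next-value prediction on a synthetic time series.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Build (window -> next value) training pairs from a sine wave.
series = np.sin(np.linspace(0, 30, 600))
window = 20
X = np.stack([series[i:i + window] for i in range(len(series) - window)])[..., None]
y = series[window:].reshape(-1, 1)

model = keras.Sequential([
    layers.Input(shape=(window, 1)),   # sequence of 20 timesteps, 1 feature
    layers.LSTM(32),                   # gated recurrent layer (LSTM cell)
    layers.Dense(1),                   # regression head: the next value
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=1)
```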
- Day 3 Generative Models and Reinforcement Learning
6. Generative models: Variational Autoencoder (VAE) and Generative Adversarial Networks (GAN).
- Presentation of generative models; link with the CNNs covered on Day 2
- Autoencoder: dimensionality reduction and limited generation
- Variational Autoencoder: a generative model approximating the distribution of the data. Definition and use of the latent space. Reparameterization trick. Applications and observed limits.
- Generative Adversarial Networks: fundamentals. Dual-network architecture (generator and discriminator) with alternating training, available cost functions (a minimal training-loop sketch follows this section). Convergence of a GAN and difficulties encountered.
- Improving convergence: Wasserstein GAN, BEGAN. Earth Mover's Distance.
- Applications: generation of images and photographs, text generation, super-resolution.
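
The alternating training scheme can be sketched in a few lines of PyTorch. This is a toy illustration only: a 1-D "real" distribution, tiny fully connected networks, and the standard binary cross-entropy losses; all sizes and learning rates are assumptions:

```python
# Minimal sketch: alternating GAN training. The discriminator D learns to
# separate real from generated samples; the generator G learns to fool D.
import torch
from torch import nn

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))               # generator
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid()) # discriminator
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(1000):
    real = torch.randn(64, 1) * 2 + 3          # toy "real" distribution: N(3, 2)
    fake = G(torch.randn(64, 8))               # samples from latent noise

    # Discriminator step: real -> 1, fake -> 0 (fake detached from G's graph)
    opt_d.zero_grad()
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    loss_d.backward()
    opt_d.step()

    # Generator step: push D to label the fakes as real
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(64, 1))
    loss_g.backward()
    opt_g.step()

print("generated sample mean:", G(torch.randn(256, 8)).mean().item())  # ~3 if converged
```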
7. Deep Reinforcement Learning.
- Presentation of reinforcement learning: controlling an agent in an environment defined by a state and possible actions
- Use of a neural network to approximate the state function
- Deep Q-Learning: experience replay, and application to controlling a video game (a replay-buffer sketch follows this section).
- Optimization of the learning policy. On-policy vs. off-policy. Actor-critic architecture. A3C.
- Applications: control of a single video game or a digital system.
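
A minimal sketch of the experience-replay mechanism named above: transitions are stored in a bounded buffer and sampled uniformly at random for each training step, which breaks the temporal correlation of consecutive frames. The capacity, batch size and fake interaction loop are illustrative:

```python
# Minimal sketch: the experience-replay buffer used in Deep Q-Learning.
import random
from collections import deque

class ReplayBuffer:
    def __init__(self, capacity=10_000):
        self.buffer = deque(maxlen=capacity)   # oldest transitions fall off the end

    def push(self, state, action, reward, next_state, done):
        self.buffer.append((state, action, reward, next_state, done))

    def sample(self, batch_size=32):
        batch = random.sample(self.buffer, batch_size)   # uniform random minibatch
        return map(list, zip(*batch))          # columns: states, actions, rewards, ...

buf = ReplayBuffer()
for t in range(100):                           # stand-in for the agent/environment loop
    buf.push(state=t, action=t % 4, reward=1.0, next_state=t + 1, done=False)

states, actions, rewards, next_states, dones = buf.sample(batch_size=8)
print(len(states), "transitions sampled for one training step")
```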