PyCon-India 2019

PyCon-India 2019 National Level Conference






          PyCon India 2019, a national-level conference, was held from 12-10-2019 to 13-10-2019 at the Trade Centre.
 
          This was my first time attending PyCon, and at first I did not think I would get to go. PyCon India had invited VGLUG (Villupuram GNU/Linux Users Group) to present a poster, and the community initially selected two people because only two tickets were allotted, so I was not chosen. Then one morning my mentor from VGLUG called me: "Hello Vignesh, you have been selected for the PyCon conference at the Trade Centre. Your train leaves at 6.50, so get up and hurry." It was already 6.30 and I did not know what to do, but somehow I caught the train to Chennai on time. After two and a half hours I reached Guindy.

          This was a great opportunity for me because the conference is entirely about the Python language: it had many Python stalls and ran many Python programs. I was really lucky that VGLUG was selected to provide Python training for me; it is a good platform to improve my knowledge.


          The sponsoring companies were divided into three tiers: Platinum (Indeed, AQR, Microsoft), Gold (Visible Alpha, MAD, HappyFox, AWS, JetBrains, Ericsson, Innovaccer, Merit) and Silver (ZeOmega, Red Hat, Pramati, GUVI, ERPNext, Elastic, Pipal Academy, Episource, DCKAP, AppViewX, KLA, DeepSource, Toyota).





This is our team at PyCon India 2019


Day 1


          First I visited the DCKAP stall. The people at the stall explained what they do: DCKAP helps companies develop software applications and web applications (web development).






          After that, my friends Dilip and Annapoorani, my organizers Sathish, Kaleel, Vijaya Lakshmi, Karkee and Ethiraj, and I went around all the stalls at PyCon India.






          Then we started attending the talks. PyCon India offered 39 talks, and they were very valuable: they improved our knowledge of Python, how to use it, and the different ways it can be applied. Python is one of the simplest languages compared to others because its syntax reads almost like English, so it is easy to understand. It is also a powerful language, used for artificial intelligence research, data science, and much more.

 Conference :

 Augmented Reality : 

            A Python augmented reality app that uses OpenCV and OpenGL to render and interact with 3D robots on 2D markers.

            OpenCV provides the computer vision functionality to detect a 2D marker and yield its translation and rotation vectors.

            OpenGL graphics library uses those vectors to render a Blender-generated  3D robot on top of the marker. New interactive features can be easily added to the app using Python threads.
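
            As a rough illustration of the OpenCV half of that pipeline (my own sketch, not the speaker's code), the snippet below detects an ArUco 2D marker and estimates its rotation and translation vectors using the opencv-contrib ArUco module as it existed around 2019. The camera intrinsics are placeholder values that a real app would obtain from calibration.

import cv2
import numpy as np

# Placeholder intrinsics for a 640x480 camera (assumption; use real calibration data).
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

aruco_dict = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)
detector_params = cv2.aruco.DetectorParameters_create()

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict,
                                              parameters=detector_params)
    if ids is not None:
        # One rotation vector (rvec) and translation vector (tvec) per marker;
        # these are the values an OpenGL layer would use to place the 3D robot.
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, 0.05, camera_matrix, dist_coeffs)
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("AR marker", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()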
 





   Libraries for AR :







  Machine Learning Bias :

               Detecting bias in machine learning models has become of great importance in recent times. Bias in a machine learning model is about the model making predictions which tend to place certain privileged groups at a systematic advantage and certain unprivileged groups at a systematic disadvantage. 

         The primary reason for unwanted bias is the presence of biases in the training data, due to either prejudice in labels or under-sampling/over-sampling of data. Especially in the banking, finance, and insurance industries, customers, partners and regulators are asking businesses tough questions about the initiatives taken to avoid and detect bias.
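
         To make the idea of bias detection concrete, here is a tiny sketch (my own illustration, not from the talk) that computes the widely used disparate impact ratio: the rate of favourable predictions for the unprivileged group divided by the rate for the privileged group. The column names and numbers are made up for the example.

import pandas as pd

def disparate_impact(df, group_col, favourable_col):
    """Ratio of favourable-outcome rates: unprivileged / privileged."""
    rates = df.groupby(group_col)[favourable_col].mean()
    return rates["unprivileged"] / rates["privileged"]

# Toy model predictions: 1 means a favourable outcome (e.g. loan approved).
predictions = pd.DataFrame({
    "group": ["privileged"] * 6 + ["unprivileged"] * 6,
    "approved": [1, 1, 1, 0, 1, 1, 1, 0, 0, 1, 0, 0],
})

# A value close to 1.0 suggests parity; values well below 1.0 suggest the
# unprivileged group is being placed at a systematic disadvantage.
print(disparate_impact(predictions, "group", "approved"))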

Poster Session :
      The poster session was very interesting and useful to me because it covered a huge range of Python-based technologies, such as machine learning, data science, animations, web development, signal processing and sensors, databases and more:
  • Web Scraping using Python
  • VFX using Python
  • Signal processing software for "Ground Penetrating Radar"
  • Python-based tools for Scientific Animations
  • Create your own Data Set using Python
  • Machine Learning
  
 Web Scraping :

          Imagine you have to pull a large amount of data from websites and you want to do it as quickly as possible. How would you do it without manually going to each website and getting the data? Well, “Web Scraping” is the answer. Web Scraping just makes this job easier and faster.
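
          As a small illustration (my own example, not the poster's code), here is a minimal scraper using the requests and BeautifulSoup libraries; the URL and the tags being extracted are placeholders.

import requests
from bs4 import BeautifulSoup

url = "https://example.com"          # placeholder page to scrape
response = requests.get(url, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# Pull every heading and link from the page instead of copying them by hand.
for heading in soup.find_all(["h1", "h2"]):
    print("Heading:", heading.get_text(strip=True))
for link in soup.find_all("a", href=True):
    print("Link:", link["href"])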

Other Valuable Programs :
           
           I participated in the Python quiz program conducted by JetBrains.

 

Day 2

 

Programs : 

  • GraphQL API with Django

  • Smart reply for chatbot

  • Natural Language Generation   

 

GraphQL API with Django : 

          Web APIs are the engines that power most of our applications today. For many years REST has been the dominant architecture for APIs, but this talk explored GraphQL.

REST vs GraphQL :

          With REST APIs, you generally create URLs for every object of data that's accessible. Let's say we are building a REST API for movies - we'll have URLs for the movies themselves, actors, awards, directors, producers... it's already getting unwieldy. This could mean a lot of requests for one batch of related data. If you were a user on a low-powered mobile phone with a slow internet connection, this situation wouldn't be ideal.

          GraphQL is not an API architecture like REST, it's a language that allows us to share related data in a much easier fashion. We'll use it to design an API for movies. Afterwards, we'll look at how the Graphene library enables us to build APIs in Python by making a movie API with Django.
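
          As a rough sketch of what that looks like (my own minimal example, not the speaker's code), here is a movie query schema built with graphene-django; the Movie model and its fields are assumptions for illustration. A single /graphql endpoint then serves queries, instead of one REST URL per object.

import graphene
from graphene_django import DjangoObjectType

from movies.models import Movie          # hypothetical Django model


class MovieType(DjangoObjectType):
    class Meta:
        model = Movie
        fields = ("id", "title", "year", "director")


class Query(graphene.ObjectType):
    all_movies = graphene.List(MovieType)
    movie = graphene.Field(MovieType, id=graphene.Int(required=True))

    def resolve_all_movies(root, info):
        return Movie.objects.all()

    def resolve_movie(root, info, id):
        return Movie.objects.get(pk=id)


schema = graphene.Schema(query=Query)

# A client then asks for exactly the fields it needs in one request, e.g.:
# query { allMovies { title director } }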

 

Natural Language Generation :

           NLG is an area where we are trying to teach machines how to generate natural language in a sensible manner. This, in itself, is a challenging AI task, and deep learning has really helped us perform it. Let me give you an example. If you use Google's new inbox, you may notice that when you reply to any mail, you get the three most relevant replies, in the form of sentences, for the given mail. Google used millions of e-mails and trained an NLG model with deep learning to generate or predict the most relevant reply for any given mail.
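
           As a very rough illustration of the idea (not Google's model), here is a sketch that asks a publicly available sequence-to-sequence model from the transformers library to suggest a few short replies to an e-mail; the model name and prompt are assumptions for the example.

from transformers import pipeline

# Any general-purpose text-to-text model will do for a demo (assumption).
generator = pipeline("text2text-generation", model="google/flan-t5-small")

email = "Hi, are you free for a quick call tomorrow at 10 am to discuss the report?"
prompt = f"Write a short, polite reply to this email: {email}"

# Sample a few candidate replies, similar in spirit to the three suggestions
# described in the inbox example above.
for candidate in generator(prompt, num_return_sequences=3,
                           do_sample=True, max_new_tokens=30):
    print(candidate["generated_text"])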

 

Prizes :

I got a T-shirt for winning the JetBrains quiz program.

 

Discussions during the tea break...


 

Finally, the photo shoot...

           with the Python logo...

   Day 2 evening...

In front of the Trade Centre
 

