Smart Tool, Apps, Solution – URL Link List

Teamgantt.com

Free online gantt chart maker software. TeamGantt is a refreshing solution that brings project scheduling software online. You can plan and manage your projects with this easy-to-use gantt software, and invite your co-workers, teammates, and friends to view and edit your gantt chart.

bettyblocks.com

Betty Blocks no-code application development platform

No-code platform that lets you create mobile, business, and web applications at lightning speed with Betty Blocks.

peopledatalabs.com

Empowers developers, engineers, and data scientists to build and scale innovative, data-driven products using high-quality, always-accurate B2B data.

wdl-ol

WDL / IPlug (Oli Larkin Edition). IPlug is a simple-to-use C++ framework for developing cross-platform audio plugins and targeting multiple plugin APIs with the same code.

AI/ML – notes

The ‘flood-fill’ algorithm takes three parameters: a start node, a target color, and a replacement color. The algorithm looks for all nodes in the array that are connected to the start node by a path of the target color and changes them to the replacement color.
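A minimal Python sketch of the idea, assuming a 2-D list of color values and a 4-connected neighbourhood (the grid, colors, and function name below are illustrative, not from the original notes):

from collections import deque

def flood_fill(grid, start, target_color, replacement_color):
    """Change every node connected to `start` through `target_color`
    to `replacement_color` (4-connected neighbourhood)."""
    rows, cols = len(grid), len(grid[0])
    r0, c0 = start
    if grid[r0][c0] != target_color or target_color == replacement_color:
        return grid
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if 0 <= r < rows and 0 <= c < cols and grid[r][c] == target_color:
            grid[r][c] = replacement_color
            queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return grid

# Example: fill the connected region of 1s starting at (0, 0) with 2s
image = [[1, 1, 0],
         [1, 0, 0],
         [0, 0, 1]]
print(flood_fill(image, (0, 0), target_color=1, replacement_color=2))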

The ‘Boundary Fill’ algorithm starts at a pixel inside the polygon to be filled and paints the interior, proceeding outwards towards the boundary. This algorithm works only if the color with which the region has to be filled and the color of the boundary of the region are different.
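A similar recursive sketch with an invented grid; the key difference from flood fill is that the recursion stops when it hits the boundary color rather than when it leaves the target color:

def boundary_fill(grid, r, c, fill_color, boundary_color):
    """Paint outwards from an interior seed until the boundary color is hit.
    Works only when fill_color differs from boundary_color."""
    if not (0 <= r < len(grid) and 0 <= c < len(grid[0])):
        return
    if grid[r][c] in (boundary_color, fill_color):
        return
    grid[r][c] = fill_color
    boundary_fill(grid, r + 1, c, fill_color, boundary_color)
    boundary_fill(grid, r - 1, c, fill_color, boundary_color)
    boundary_fill(grid, r, c + 1, fill_color, boundary_color)
    boundary_fill(grid, r, c - 1, fill_color, boundary_color)

# Example: 9 marks the polygon boundary, 0 the interior; fill the interior with 5
poly = [[9, 9, 9, 9],
        [9, 0, 0, 9],
        [9, 0, 0, 9],
        [9, 9, 9, 9]]
boundary_fill(poly, 1, 1, fill_color=5, boundary_color=9)
print(poly)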

‘Flat-field correction’ (FFC) is a technique used to improve quality in digital imaging. It cancels the effects of image artifacts caused by variations in the pixel-to-pixel sensitivity of the detector and by distortions in the optical path. It is a standard calibration procedure in everything from personal digital cameras to large telescopes.
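A hedged NumPy sketch of the standard correction formula C = (R - D) * m / (F - D), where R is the raw frame, F the flat-field frame, D the dark frame, and m the mean of the dark-subtracted flat; the toy frames below are made up for illustration:

import numpy as np

def flat_field_correct(raw, flat, dark):
    """Flat-field correction: C = (R - D) * m / (F - D),
    with m the mean of the dark-subtracted flat field."""
    raw, flat, dark = (np.asarray(a, dtype=float) for a in (raw, flat, dark))
    gain = flat - dark
    m = gain.mean()
    return (raw - dark) * m / np.where(gain == 0, 1, gain)

# Toy frames: a raw image, a flat-field frame, and a dark frame of the same shape
raw = np.array([[110.0, 95.0], [105.0, 100.0]])
flat = np.array([[110.0, 90.0], [105.0, 100.0]])
dark = np.full((2, 2), 5.0)
print(flat_field_correct(raw, flat, dark))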

NLP Word2Vec Model

Word2Vec is a statistical method for efficiently learning a standalone word embedding from a text corpus.

Additionally, the work involved analysis of the learned vectors and the exploration of vector math on the representations of words. For example, subtracting the “man-ness” from “King” and adding “woman-ness” results in the word “Queen”, capturing the analogy “king is to queen as man is to woman” (demonstrated in the sample code below).

Two different learning models were introduced that can be used as part of the word2vec approach to learn the word embedding; they are:

  • Continuous Bag-of-Words, or CBOW model.
  • Continuous Skip-Gram Model.

The CBOW model learns the embedding by predicting the current word based on its context. The continuous skip-gram model learns by predicting the surrounding words given a current word.

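As a rough illustration of the two models, gensim’s Word2Vec exposes the choice through the sg flag (a minimal sketch assuming the gensim 4.x API; the toy corpus below is invented):

from gensim.models import Word2Vec

# A tiny toy corpus; in practice this would be a large tokenized text corpus
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "and", "a", "woman"],
]

# sg=0 trains CBOW (predict the current word from its context);
# sg=1 trains continuous skip-gram (predict the context from the current word)
cbow_model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)
skipgram_model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

print(cbow_model.wv["king"].shape)              # (50,)
print(skipgram_model.wv.most_similar("king", topn=2))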

Sample Code

import numpy as np
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

### download the word vectors

import gensim.downloader as api
word2vec_model = api.load('word2vec-google-news-300')

### vector representation of a word

word2vec_model["beautiful"]

### words most similar to "girl"

word2vec_model.most_similar("girl")

### queen - girl + boy = king

word2vec_model.most_similar(positive=['boy', 'queen'], negative=['girl'], topn=1)

### vocabulary and t-SNE plot

vocab = ["boy", "girl", "man", "woman", "king", "queen",
         "banana", "apple", "mango", "fruit", "coconut", "orange"]

def tsne_plot(model):
    labels = []
    wordvecs = []
    for word in vocab:
        wordvecs.append(model[word])
        labels.append(word)
    tsne_model = TSNE(perplexity=3, n_components=2, init='pca', random_state=42)
    coordinates = tsne_model.fit_transform(wordvecs)
    x = []
    y = []
    for value in coordinates:
        x.append(value[0])
        y.append(value[1])
    plt.figure(figsize=(8, 8))
    for i in range(len(x)):
        plt.scatter(x[i], y[i])
        plt.annotate(labels[i], xy=(x[i], y[i]), xytext=(2, 2),
                     textcoords='offset points', ha='right', va='bottom')
    plt.show()

tsne_plot(word2vec_model)