ketan agrawal

features
Last modified on September 23, 2021

Links to “features”

Conditional generative models (Introduction > Conditional generative models)

They blur the line between generative and discriminative models, because in addition to modeling the data they condition on some input features.

\(P(X \mid Y = \text{Bedroom})\)

Super-resolution: p(high-res signal | low-res signal)
Inpainting: p(full image | mask)
Colorization: p(color image | greyscale)
Translation: p(English text | Chinese text)
Text-to-Image: p(image | caption)
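All of the tasks above share the same shape: sample x from a distribution conditioned on some input y. A minimal sketch of that idea, using a made-up class-conditional Gaussian (the labels and parameters below are illustrative, not from any real model):

```python
import random

# Toy conditional generative model: p(x | y) = Normal(mu_y, sigma_y).
# The classes and their parameters are invented for illustration.
params = {
    "Bedroom": (0.0, 1.0),   # (mean, std) of p(x | Y=Bedroom)
    "Kitchen": (5.0, 1.5),   # (mean, std) of p(x | Y=Kitchen)
}

def sample_conditional(y):
    """Draw one sample x from p(x | Y=y)."""
    mu, sigma = params[y]
    return random.gauss(mu, sigma)

random.seed(0)
x = sample_conditional("Bedroom")  # a float near 0, since mu_Bedroom = 0
```

Real conditional models (super-resolution, text-to-image, etc.) replace the Gaussian with a deep network, but the interface is the same: the condition y selects which distribution over x you sample from.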

Node-level Features (Traditional Graph ML Methods > Node-level Features)

Different ways to model centrality:

- Degree centrality: a node's importance is its (normalized) degree.
- Eigenvector centrality: a node is important if its neighbors are important.
- Betweenness centrality: a node is important if it lies on many shortest paths between other nodes.
- Closeness centrality: a node is important if it has small shortest-path distances to all other nodes.
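As one concrete illustration, closeness centrality (importance as inverse total shortest-path distance to other nodes) can be computed with a plain BFS. The graph below is a made-up toy example:

```python
from collections import deque

# Toy undirected graph as an adjacency list (made-up example).
graph = {
    "A": ["B", "C"],
    "B": ["A", "C", "D"],
    "C": ["A", "B"],
    "D": ["B"],
}

def closeness_centrality(graph, node):
    """1 / (sum of shortest-path distances from `node` to every other
    reachable node), computed by BFS on an unweighted graph."""
    dist = {node: 0}
    queue = deque([node])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    total = sum(d for n, d in dist.items() if n != node)
    return 1.0 / total if total > 0 else 0.0
```

Here node B (adjacent to everything) scores higher than the peripheral node D, matching the intuition that central nodes are "close" to the rest of the graph.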

Link-level Features (Traditional Graph ML Methods > Link-level Features)

These can be used in link prediction tasks – i.e., predicting whether two nodes should be connected, or will become connected in the future.
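Two classic link-level features are the number of common neighbors and the Jaccard coefficient of a candidate pair; higher scores suggest the pair is more likely to be (or become) linked. A small sketch on a made-up graph:

```python
# Toy undirected graph with neighbor sets (invented example).
graph = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B", "D"},
    "D": {"B", "C"},
}

def common_neighbors(graph, u, v):
    """Number of neighbors shared by u and v."""
    return len(graph[u] & graph[v])

def jaccard(graph, u, v):
    """|N(u) & N(v)| / |N(u) | N(v)| -- overlap normalized by union."""
    union = graph[u] | graph[v]
    return len(graph[u] & graph[v]) / len(union) if union else 0.0
```

For the non-adjacent pair (A, D), both share neighbors B and C, so these features would rank that edge as a strong candidate for link prediction.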

Graph-level Features (Traditional Graph ML Methods > Graph-level Features)

Goal: we want features that characterize the structure of an entire graph.

Kernel methods are widely used for traditional graph-level prediction. The idea is to design a kernel between graphs instead of hand-crafting feature vectors.

That is, we want some graph feature vector \(\phi(G)\). This is essentially bag-of-words for a graph: each node contributes some features, but the ordering of and relations between nodes aren't considered.
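A minimal sketch of the bag-of-words idea: let \(\phi(G)\) be a histogram of node degrees (ignoring which node has which degree), and define the kernel as a dot product of two such histograms. The graphs below are invented toy examples:

```python
from collections import Counter

# Bag-of-words-style graph feature: phi(G) counts how many nodes have
# each degree, discarding node identity and ordering entirely.
def degree_histogram(graph):
    return Counter(len(nbrs) for nbrs in graph.values())

def kernel(g1, g2):
    """K(G1, G2) = phi(G1) . phi(G2), summed over shared degree values."""
    h1, h2 = degree_histogram(g1), degree_histogram(g2)
    return sum(h1[d] * h2[d] for d in h1.keys() & h2.keys())

# Two small made-up graphs (adjacency lists).
g1 = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}  # path on 3 nodes
g2 = {"X": ["Y"], "Y": ["X"]}                   # a single edge
```

Real graph kernels (e.g., graphlet or Weisfeiler-Lehman kernels) use richer per-node counts than raw degree, but the structure is the same: count local patterns, then compare graphs via an inner product of the counts.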