Sequence Learning Problems
In sequence learning problems, two assumptions behind fully connected (FCNN) and
convolutional (CNN) networks no longer hold: the output at any time step depends on
previous inputs/outputs, and the length of the input is not fixed.
1. Sequence Prediction:
Sequence prediction involves predicting the next value for a given input sequence
(a minimal code sketch follows the examples below).
For example: Input Sequence: 1, 2, 3, 4, 5; Output Sequence: 6
Examples:
1. Weather Forecasting. Given a sequence of observations about the weather
over time, predict the expected weather tomorrow.
2. Stock Market Prediction. Given a sequence of movements of a security over
time, predict the next movement of the security.
3. Product Recommendation. Given a sequence of past purchases for a customer,
predict the next purchase for that customer.
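The sketch below illustrates sequence prediction with a recurrent network, assuming
TensorFlow/Keras is installed. An LSTM is trained on sliding windows of a toy numeric
series to predict the next value; the window size, layer width, and training settings
are illustrative choices, not prescribed values.

```python
# Minimal sequence-prediction sketch (assumes TensorFlow/Keras is available).
import numpy as np
import tensorflow as tf

# Toy data: sliding windows over 1..100; each window of 5 values predicts the next one.
series = np.arange(1, 101, dtype=np.float32)
window = 5
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]  # shape (samples, timesteps, features)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),   # reads the sequence step by step
    tf.keras.layers.Dense(1),   # single real-valued output: the next element
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=50, verbose=0)

# Predict the value following 1, 2, 3, 4, 5 (should approach 6 with enough training/scaling).
print(model.predict(np.array([[1, 2, 3, 4, 5]], dtype=np.float32)[..., np.newaxis]))
```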
2. Sequence Classification:
Sequence classification involves predicting a class label for a given input sequence
(a minimal code sketch follows the examples below).
For example: Input Sequence: 1, 2, 3, 4, 5; Output Sequence: "good"
Examples:
1. DNA Sequence Classification. Given a DNA sequence of A, C, G, and T
values, predict whether the sequence is for a coding or non-coding region.
2. Anomaly Detection. Given a sequence of observations, predict whether the
sequence is anomalous or not.
3. Sentiment Analysis. Given a sequence of text such as a review or a tweet,
predict whether the sentiment of the text is positive or negative.
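The sketch below shows sequence classification in the same spirit, assuming
TensorFlow/Keras: an embedding plus an LSTM reduce a whole integer-encoded sequence to
one class label, as in the sentiment example. The vocabulary size, sequence length, and
random dummy data are placeholders standing in for a real tokenized dataset.

```python
# Minimal sequence-classification sketch (assumes TensorFlow/Keras is available).
import numpy as np
import tensorflow as tf

vocab_size, seq_len = 1000, 20
X = np.random.randint(1, vocab_size, size=(200, seq_len))  # dummy token IDs
y = np.random.randint(0, 2, size=(200,))                   # dummy 0/1 labels (e.g. negative/positive)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(seq_len,)),
    tf.keras.layers.Embedding(vocab_size, 16),       # token IDs -> dense vectors
    tf.keras.layers.LSTM(32),                         # summarizes the whole sequence
    tf.keras.layers.Dense(1, activation="sigmoid"),   # one label per sequence
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=2, verbose=0)

# One probability per input sequence; threshold at 0.5 for a "good"/"bad" style label.
print(model.predict(X[:1]))
```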
3. Sequence Generation:
Sequence generation involves generating a new output sequence that has the same
general characteristics as other sequences in the corpus (a minimal code sketch
follows the examples below).
For example: Input Sequences: [1, 3, 5], [7, 9, 11]; Output Sequence: [3, 5, 7]
Examples:
1. Text Generation. Given a corpus of text, such as the works of Shakespeare,
generate new sentences or paragraphs of text that read as if they could have
been drawn from the corpus.
2. Handwriting Prediction. Given a corpus of handwriting examples, generate
handwriting for new phrases that has the properties of the handwriting in the
corpus.
3. Music Generation. Given a corpus of examples of music, generate new
musical pieces that have the properties of the corpus.
4. Image Caption Generation. Given an image as input, generate a sequence of
words that describes the image. For example: Input Sequence: [image pixels];
Output Sequence: ["man riding a bike"]
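The sketch below illustrates sequence generation, assuming TensorFlow/Keras: a
character-level LSTM is trained to predict the next character of a toy corpus, then
sampled repeatedly to produce new text with the corpus's general characteristics. The
corpus, window size, and layer sizes are placeholders; real text generation would use a
much larger corpus and model.

```python
# Minimal character-level generation sketch (assumes TensorFlow/Keras is available).
import numpy as np
import tensorflow as tf

corpus = "the quick brown fox jumps over the lazy dog " * 20  # toy stand-in for a real corpus
chars = sorted(set(corpus))
c2i = {c: i for i, c in enumerate(chars)}
window = 10

X = np.array([[c2i[c] for c in corpus[i:i + window]]
              for i in range(len(corpus) - window)])
y = np.array([c2i[corpus[i + window]] for i in range(len(corpus) - window)])

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window,)),
    tf.keras.layers.Embedding(len(chars), 16),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(len(chars), activation="softmax"),  # distribution over the next character
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=10, verbose=0)

# Generate: feed a seed, sample the next character, slide the window, repeat.
generated = list("the quick ")
for _ in range(50):
    x = np.array([[c2i[c] for c in generated[-window:]]])
    probs = model.predict(x, verbose=0)[0]
    probs = probs / probs.sum()                    # renormalize against float error
    generated.append(chars[np.random.choice(len(chars), p=probs)])
print("".join(generated))
```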
4. Sequence-to-Sequence Prediction:
Sequence-to-sequence prediction involves predicting an output sequence given an input
sequence (a minimal encoder-decoder sketch follows the examples below).
For example: Input Sequence: 1, 2, 3, 4, 5; Output Sequence: 6, 7, 8, 9, 10
Examples:
1. Multi-Step Time Series Forecasting. Given a time series of observations,
predict a sequence of observations for a range of future time steps.
2. Text Summarization. Given a document of text, predict a shorter sequence of
text that describes the salient parts of the source document.
3. Program Execution. Given the textual description of a program or mathematical
equation, predict the sequence of characters that describes the correct output.
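The sketch below shows a basic encoder-decoder for sequence-to-sequence prediction,
assuming TensorFlow/Keras: the encoder LSTM compresses the input sequence into a fixed
vector, which is repeated once per output step and decoded into a multi-step output, as
in multi-step forecasting. The dimensions and the toy data (each [i..i+4] mapped to
[i+5..i+9]) are illustrative only.

```python
# Minimal encoder-decoder (sequence-to-sequence) sketch (assumes TensorFlow/Keras).
import numpy as np
import tensorflow as tf

in_len, out_len, n_features, units = 5, 5, 1, 64

# Encoder: read the input sequence and keep a fixed-size summary vector.
enc_in = tf.keras.Input(shape=(in_len, n_features))
encoded = tf.keras.layers.LSTM(units)(enc_in)

# Decoder: repeat the summary once per output step and emit one value per step.
dec = tf.keras.layers.RepeatVector(out_len)(encoded)
dec = tf.keras.layers.LSTM(units, return_sequences=True)(dec)
out = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(n_features))(dec)

model = tf.keras.Model(enc_in, out)
model.compile(optimizer="adam", loss="mse")

# Toy data: map [i, ..., i+4] to [i+5, ..., i+9], echoing the 1..5 -> 6..10 example above.
X = np.array([[[i + j] for j in range(in_len)] for i in range(1, 50)], dtype=np.float32)
Y = np.array([[[i + in_len + j] for j in range(out_len)] for i in range(1, 50)], dtype=np.float32)
model.fit(X, Y, epochs=100, verbose=0)

print(model.predict(np.array([[[1], [2], [3], [4], [5]]], dtype=np.float32)).round(1))
```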