\begin{slide}[\slideopts,toc={}]{Abstract}
Today's machine-learning (AI) sequence models, such as large language
models, arose from a blend of principle-based design and empirical
discovery spanning several fields.
\maybepause
This talk describes how these ideas could have emerged from an elementary signal-processing
approach.
\maybepause
This viewpoint offers several benefits:
\begin{enumerate}
\mpitem Signal-processing practitioners can quickly learn what is happening, with clear motivation
\mpitem Machine-learning experts might benefit from signal-processing insights
\mpitem Obvious suggestions for what to try next arise naturally
\end{enumerate}
% \href{https://ccrma.stanford.edu/ccrma-open-house}{[Open House Schedule]}
%\textbf{Overheads and more:} \href{https://ccrma.stanford.edu/~jos/Welcome.html#dsponline24}{https://ccrma.stanford.edu/\~{}jos/Welcome.html\#dsponline24}
%\textbf{Overheads and more:} \href{https://ccrma.stanford.edu/~jos/Welcome.html#dsponline24}{https://ccrma.stanford.edu/~jos/Welcome.html#dsponline24}
\end{slide}