ERNIE's here — is OSCAR next?
In "Contextualized Muppet Embeddings" (2/13/2019) I noted the advent of ELMo ("Embeddings from Language Models") and BERT ("Bidirectional Encoder Representations from Transformers"), and predicted ERNiE, GRoVEr, KERMiT, …
I'm happy to say that the first of these predictions has come true:
"Baidu’s ERNIE 2.0 Beats BERT and XLNet on NLP Benchmarks", Synced 7/30/2019
"Baidu unveils ERNIE 2.0 natural language framework in Chinese and English", VentureBeat 7/30/2019
Actually I'm late reporting this, since ERNIE 1.0 came out in March:
"Baidu’s ERNIE Tops Google’s BERT in Chinese NLP Tasks", Synced 3/25/2019
But I'm ashamed to say that the Open System for Classifying Ambiguous Reference (OSCAR) is still just an idea, though I did recruit a collaborator who agreed in principle to work with me on it.
Mike Maltz said,
August 2, 2019 @ 2:50 pm
As the originator and sole member of the Committee for the Abolition of Contrived Acronyms, I take strong exception to using the names of these Sesame Street characters in this way.
[(myl) I believe that I've encountered other members of CACA — are you sure that you're the only one?]
peter siegelman said,
August 2, 2019 @ 3:34 pm
I know it doesn't really count, but there is already an OSCAR: the Online System for Clerkship Application and Review, which the federal courts use for handling judicial clerkship applications.
Andrew (not the same one) said,
August 2, 2019 @ 3:52 pm
Well, for that matter ERNIE has for a long time been the Electronic Random Number Indicator Equipment.
Roscoe said,
August 2, 2019 @ 4:22 pm
The real question is how "ERNIE tops…BERT" made it past the censors.
Rick Rubenstein said,
August 2, 2019 @ 7:52 pm
I suspect SNUFFLEUPAGUS may have to await the Singularity.
[(myl) Um, see "Several New Ultimate Feature Finders Letting Embeddings Use Procedurally Acquired Global Universal Structure".]
Philip Taylor said,
August 3, 2019 @ 5:34 am
… and "OSCAR" is also "Orbital Satellite Carrying Amateur Radio".
Chris said,
August 6, 2019 @ 9:52 am
KERMIT: Generative Insertion-Based Modeling for Sequences
https://arxiv.org/abs/1906.01604
The acronym stands for "Kontextuell Encoder Representations Made by Insertion Transformations", where "Kontextuell" is a German word with English inflection; otherwise it would have an 'e' at the end.
Are code-mixed acronyms the future?