Here is a recent paper with some references:
https://www.aclweb.org/anthology/N19-1090/

Here is an example of an earlier paper that uses context-free generation instead of neural generation:
https://www.aclweb.org/anthology/W14-1815.pdf

You may also be interested in related settings, such as conditioning on x having a particular meaning, being in a particular formal language, or following the constraints of a poetic form:
https://www.aclweb.org/anthology/P19-1080/
https://www.aclweb.org/anthology/K19-1045.pdf
https://www.aclweb.org/anthology/P17-4008.pdf
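As a point of comparison for the methods in those papers, the simplest baseline for "generate a sequence containing token w" is rejection sampling: draw from an unconstrained generator and keep only outputs that contain the required token. The sketch below is purely illustrative and is not from any of the papers above; the toy `generate` function stands in for whatever learned sampler (e.g., a SeqGAN generator) you actually have.

```python
import random

def generate(rng):
    """Toy stand-in for a learned generator: samples a short word sequence
    from a fixed vocabulary. Replace with your real model's sampler."""
    vocab = ["the", "students", "attend", "a", "university", "lecture", "today"]
    return [rng.choice(vocab) for _ in range(6)]

def generate_with_token(required, rng, max_tries=10000):
    """Rejection sampling: resample until the required token appears.
    Correct but wasteful when the token is rare under the base model."""
    for _ in range(max_tries):
        seq = generate(rng)
        if required in seq:
            return seq
    raise RuntimeError(f"token {required!r} never appeared in {max_tries} samples")

rng = random.Random(0)
sentence = generate_with_token("university", rng)
```

The obvious drawback is that the expected number of samples grows as the required token gets rarer under the base model, which is exactly the inefficiency that constrained decoding and the conditional-generation methods in the papers above are designed to avoid.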
Cheers, Jason Eisner
On Wed, Mar 11, 2020 at 10:35 AM saedeh tahery <saedeh.tahery at gmail.com> wrote:
> Dear all,
> Recently, I've found generative models such as SeqGAN that can produce
> discrete sequences (i.e., text). I know they can generate realistic-looking
> text. However, is there any generative model that can produce a sequence
> containing some specific tokens? E.g., a model that generates sentences
> including the particular word "university".
> Any guidance would be appreciated.
> Best Regards,
> --
> Sa'edeh Tahery
> Ph.D. Student, Artificial Intelligence
> K. N. Toosi University of Technology
> Tehran, Iran
> https://www.linkedin.com/in/saedeh-tahery/
> Corpora mailing list
> Corpora at uib.no
> https://mailman.uib.no/listinfo/corpora