@berpj Hi Pierre, just followed you. I think this is incredible and funny. Just wondering how you did this, in non-technical terms, so I could make one for my favorite celebrities. Thank you!
@berpj @jevinsew I'm pretty sure all the tweets are post-processed, corrected, AND specifically hand-picked from a big generated set. An LSTM (word- or character-level) is not currently capable of reliably generating coherent sentences with at least some meaning; that only happens by relatively small chance. I used my own 17.7k tweets as a dataset for experiments with these models, which is 8 times more than Musk's and is generally a decent amount of data for RNN training, and that's why I'm so sure.
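For anyone curious what that workflow looks like in practice, here is a minimal sketch (not berpj's actual code; the corpus file name, seed text, and hyperparameters are all assumptions) of a character-level LSTM in Keras that generates a large batch of candidates for hand-picking, exactly the "generate many, keep the coherent few" process described above:

import numpy as np
from tensorflow import keras

# Assumed corpus: one file of plain-text tweets.
text = open("tweets.txt", encoding="utf-8").read().lower()
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

seq_len, step = 40, 3
# Slice the corpus into overlapping (input sequence, next char) pairs.
sequences = [text[i:i + seq_len] for i in range(0, len(text) - seq_len, step)]
next_chars = [text[i + seq_len] for i in range(0, len(text) - seq_len, step)]

# One-hot encode inputs and targets.
x = np.zeros((len(sequences), seq_len, len(chars)), dtype=np.float32)
y = np.zeros((len(sequences), len(chars)), dtype=np.float32)
for i, seq in enumerate(sequences):
    for t, c in enumerate(seq):
        x[i, t, char_to_idx[c]] = 1.0
    y[i, char_to_idx[next_chars[i]]] = 1.0

model = keras.Sequential([
    keras.layers.LSTM(128, input_shape=(seq_len, len(chars))),
    keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(x, y, batch_size=128, epochs=20)

def sample(preds, temperature=0.8):
    # Re-weight the softmax distribution and draw one character index;
    # lower temperatures give safer but more repetitive text.
    preds = np.log(np.asarray(preds, dtype=np.float64) + 1e-8) / temperature
    probs = np.exp(preds)
    probs /= probs.sum()
    return np.random.choice(len(probs), p=probs)

def generate(seed, length=140):
    out = seed
    for _ in range(length):
        window = out[-seq_len:].rjust(seq_len)
        x_pred = np.zeros((1, seq_len, len(chars)), dtype=np.float32)
        for t, c in enumerate(window):
            if c in char_to_idx:
                x_pred[0, t, char_to_idx[c]] = 1.0
        out += chars[sample(model.predict(x_pred, verbose=0)[0])]
    return out

# Generate a big set, then hand-pick the few that happen to make sense.
for _ in range(100):
    print(generate("the future of "))

Most of the 100 outputs will be garbage; the point of the sketch is that the curation step is doing a lot of the work.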
@albandum Hey! I used this dataset: https://github.com/berpj/elon-mu... (only transcripts from speeches). That's why there are no hashtags or links. It would be cool to train this AI on his tweets, but that dataset would be too small.
Would you be open to sharing the source for this? I want to get into this type of machine learning/AI and love learning by example. Alternatively: what resources would you recommend for starting to build and learn AI/ML?
Edit: just saw your answer on this subject to @jevinsew and @vladzima.
@berpj @samschooler @jevinsew I used a word-level char-rnn to generate new laws based on those passed by the Russian parliament in the past. That shit was hilarious, to the point where it printed "We are not aware of what we are doing, let's proceed to execution" at the end of one document.
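For reference, the word-level variant mentioned here differs from the char-level sketch above only in tokenization: the vocabulary becomes words instead of characters, so the model never misspells but needs far more data. A rough sketch of that preprocessing step (the corpus file name is an assumption):

import re

# Assumed corpus: one plain-text file of parliamentary documents.
text = open("laws.txt", encoding="utf-8").read().lower()
# Treat words and punctuation marks as separate tokens.
words = re.findall(r"\w+|[^\w\s]", text)
vocab = sorted(set(words))
word_to_idx = {w: i for i, w in enumerate(vocab)}
# From here the pipeline is identical to the char-level version above,
# with `words`/`word_to_idx` in place of `chars`/`char_to_idx`.
print(f"{len(vocab)} word tokens vs {len(set(text))} distinct characters")

The much larger vocabulary is why a word-level model on a small corpus produces grammatical but semantically absurd text, like the sentence quoted above.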