Second Mate
An open-source, mini imitation of GitHub Copilot using EleutherAI GPT-Neo-2.7B (via Huggingface Model Hub) for Emacs.
This model is much smaller than Copilot's, so it will likely not be as effective, but it can still be interesting to play around with!
Setup
Inference End / Backend
- Set `device` to "cpu" or "cuda" in `serve/server.py`.
- The “priming” is currently done in Python. If you want, modify it to another language or turn it off (priming subjectively seems to help).
- Launch `serve/server.py`. This starts a Flask app that lets us sample from the model via a REST API.
Emacs
- In `emacs/secondmate.py`, set the URL to "localhost" or the address the API is running on.
- Configure Python and script path in
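On the Emacs side, the helper script essentially just POSTs the buffer context to the backend and returns the completion for insertion. A standard-library-only sketch of that request, again assuming a hypothetical `/complete` endpoint that accepts and returns JSON (the repo's actual script may differ):

```python
# Hypothetical sketch of the Emacs-side query, using only the Python
# standard library. Assumes the backend exposes a `/complete` endpoint
# that accepts and returns JSON; the real script's API may differ.
import json
import urllib.request


def build_request(prompt, url="http://localhost:5000/complete"):
    """Build the POST request carrying the buffer context as JSON."""
    data = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )


def fetch_completion(prompt, url="http://localhost:5000/complete"):
    """POST the prompt to the backend and return the completion string."""
    with urllib.request.urlopen(build_request(prompt, url), timeout=60) as resp:
        return json.loads(resp.read())["completion"]
```

Pointing `url` at the address the API is running on (instead of "localhost") lets the heavy model run on a remote GPU box while Emacs stays local.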