You can use a robots.txt generator to create a new robots.txt file for your site or to edit an existing one. To upload an existing file and pre-populate the generator tool, type or paste the root domain URL in the top text box and click Upload. Use the tool to build directives, choosing either Allow or Disallow (Allow is the default; click to change it) for user agents (use * for all crawlers, or click to select a specific one) applied to specified content on your site. Click Add directive to add the new directive to the list. To edit an existing directive, click Remove directive, and then create a new one.
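For example, a single Disallow directive built in the generator for all user agents (*) would produce output like the following; the /private/ path here is purely a hypothetical example:

```
# Block all crawlers from a hypothetical /private/ folder
User-agent: *
Disallow: /private/
```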
Create custom user agent directives
In the robots.txt generator, Google and several other search engines can be specified within your criteria. To specify alternative directives for one crawler, click the User Agent list box (showing * by default) and select the bot. When you click Add directive, a custom section is added to the list with all of the generic directives copied in alongside the new custom directive. To change a generic Disallow directive into an Allow directive for the custom user agent, create a new Allow directive for that user agent covering the same content; the matching Disallow directive is then removed for the custom user agent.
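As a sketch of the behavior described above, suppose the generic section disallows two hypothetical paths and you then add an Allow directive for Googlebot covering one of them. The custom section carries over the generic directives, with the matching Disallow replaced by the new Allow:

```
# Generic directives for all crawlers
User-agent: *
Disallow: /archive/
Disallow: /drafts/

# Custom section for Googlebot: the Allow for /archive/ replaces
# the matching generic Disallow; the other directive is carried over
User-agent: Googlebot
Allow: /archive/
Disallow: /drafts/
```

The paths /archive/ and /drafts/ are assumptions for illustration; substitute the content paths you actually configure in the tool.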