
Serving robots.txt in Spring Boot

Serving robots.txt in Spring Boot is an easy task, and as usual there is more than one way to do it. This article presents three possible solutions using Spring Boot 2.

Serving robots.txt without controller

The easiest solution is to create a robots.txt file in the application's static resources directory. Following Spring's folder conventions, this is: /src/main/resources/static/robots.txt
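Spring Boot serves everything under the static directory verbatim, so the file will be available at /robots.txt. A minimal example of what the file might contain (the rules below are illustrative, not part of the original article):

```
User-agent: *
Disallow: /admin/
```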

That's all. In real life, however, your application most likely uses some form of request authorization. In that case, make sure robots.txt remains accessible by adding it as an exception to your security configuration.

@Configuration
public class SecurityConfig extends WebSecurityConfigurerAdapter {
    ...
    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http.authorizeRequests()
            .antMatchers("/robots.txt").permitAll()
            ...
    }
}


Serving robots.txt using a controller

An obvious solution is to use a controller for the task. This requires some extra code, but in return we get full control over the output. A simple example:

@Controller
public class RobotsController {

    @RequestMapping("/robots.txt")
    @ResponseBody
    public String robots() {
        return "User-agent: *\n" +
                "Disallow: /admin/controller";
    }

}

Note that you still need to configure security as shown in the previous section.
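If the file contains more than a couple of rules, the response body is easier to maintain as a list of lines joined together. A minimal standalone sketch (the rule lines are illustrative):

```java
import java.util.List;

public class RobotsBody {

    // Assemble the robots.txt response body from individual rule lines.
    static String robotsBody() {
        List<String> lines = List.of(
                "User-agent: *",
                "Disallow: /admin/",
                "Disallow: /internal/");
        return String.join("\n", lines);
    }

    public static void main(String[] args) {
        System.out.println(robotsBody());
    }
}
```

The same method body can be dropped into the controller's robots() method above.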


A flexible solution: controller and config file

The problem with the above solutions is that you can't easily modify the contents of robots.txt without redeploying the application. 

One way to solve this is to read the contents of robots.txt from an external file, placed, for example, next to the application's external configuration file.

The idea is to try to read the external file and, if that succeeds, return its contents to the client. If the read fails, for example because no such file exists, we fall back to the generated output from the previous section.

In this case, we need to add an extra field to the application's configuration file that tells us where to look for the external robots.txt file.


appName:
    ...
    robotsTxt-location: /var/myapp/config/robots.txt

After that, you only need to extend the controller from the previous solution:

    // Requires imports: java.io.IOException, java.nio.file.Files, java.nio.file.Paths
    @Value("${appName.robotsTxt-location:robots.txt}")
    private String robotsTxtLocation;

    @RequestMapping("/robots.txt")
    @ResponseBody
    public String robots() {
        try {
            // Serve the external file if it is readable...
            return new String(Files.readAllBytes(Paths.get(robotsTxtLocation)));
        } catch (IOException e) {
            // ...otherwise fall back to a generated default
            return "User-agent: *" + System.lineSeparator() +
                    "Disallow: /admin/";
        }
    }
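The read-with-fallback logic can be tried outside Spring as well. A self-contained sketch of the same behavior (class and method names are illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class RobotsFallback {

    // Read robots.txt from the given path; fall back to a built-in
    // default if the file is missing or unreadable.
    static String robots(Path location) {
        try {
            return new String(Files.readAllBytes(location));
        } catch (IOException e) {
            return "User-agent: *" + System.lineSeparator() +
                    "Disallow: /admin/";
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("robots", ".txt");
        Files.write(tmp, "User-agent: *\nAllow: /".getBytes());
        System.out.println(robots(tmp)); // contents of the external file
        Files.delete(tmp);
        System.out.println(robots(tmp)); // the built-in fallback
    }
}
```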
