CloudStatus

Travis-CI Status Alerts

Travis-CI
Reduced capacity macOS

Dec 13, 06:06 UTC Resolved - Capacity is back up.
Dec 13, 05:26 UTC Investigating - We are investigating a drop in capacity for macOS builds.

Travis-CI
Decreased capacity macOS builds

Dec 11, 10:15 UTC Resolved - macOS capacity has been restored, and the backlog is being processed at full speed.
Dec 11, 08:38 UTC Investigating - We are investigating a reduction in build capacity for macOS builds.

Travis-CI
Reduced Mac capacity due to emergency maintenance

Dec 7, 16:11 UTC Resolved - Maintenance has completed, and builds are running at full capacity again. Thanks for your patience.
Dec 7, 15:44 UTC Investigating - Due to a high number of requeues in our Mac infrastructure, we are reducing capacity while we investigate. Builds are still running, but at reduced capacity.

Travis-CI
macOS maintenance required

Dec 2, 02:04 UTC Resolved - We have resolved the issue with the orphaned VMs, and macOS builds are now back to normal. We thank you for your patience during this time.
Dec 1, 23:39 UTC Update - We are no longer seeing delays on our .com infrastructure, but some still remain on .org. We are seeing positive performance changes, however, and will continue to work towards fully resolving it.
Dec 1, 22:40 UTC Update - The process to clean up VMs is still ongoing, but is taking longer than expected. We will provide another update as soon as we have anything new to share. We thank you for your patience during this time.
Dec 1, 21:50 UTC Update - We are still in the process of cleaning up orphaned VMs; the process is ongoing but slow. Due to this, we expect ongoing reduced capacity. We will provide an update in the next hour or so.
Dec 1, 20:40 UTC Investigating - We are bringing down a part of our macOS infrastructure to perform unplanned maintenance. We will be clearing out VMs that were unintentionally left running and restarting them. This will help increase performance for all users. During this time, you may notice degraded macOS performance. We will provide an update in approximately one hour on the status of this maintenance.

Travis-CI
Chrome Builds Failing

Nov 30, 19:33 UTC Resolved - Workarounds and further updates may be found in https://github.com/travis-ci/travis-ci/issues/8836
Nov 30, 16:21 UTC Identified - We are currently seeing builds using Chrome fail because of a spurious permission change. Workarounds and updates can be found in the GitHub issue linked above. Thank you for your patience!
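
For builds hit by a permission change like this, the usual user-side fix is to restore the expected ownership and setuid bit on Chrome's sandbox helper before tests run. A minimal .travis.yml sketch, assuming the conventional /opt/google/chrome/chrome-sandbox path; the linked issue has the authoritative workaround:

    before_install:
      # Assumed workaround: restore root ownership and the setuid bit on
      # Chrome's sandbox helper. Path and mode are illustrative.
      - sudo chown root:root /opt/google/chrome/chrome-sandbox
      - sudo chmod 4755 /opt/google/chrome/chrome-sandbox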

Travis-CI
Network connectivity issues for public and private builds

Nov 30, 13:41 UTC Resolved - This incident has been resolved.
Nov 30, 11:03 UTC Monitoring - The network connectivity errors appear to be resolved as of approximately 10:25 UTC. We continue to monitor the situation, and we recommend restarting any affected builds and contacting us at support@travis-ci.com if they are still failing because of download errors.
Nov 30, 10:27 UTC Investigating - We're currently investigating reports of network reachability issues that are causing some downloads to fail during build time. These reachability issues seem to be affecting downloads from Launchpad, Nodejs.org, Gradle, and Jitpack. We will provide more updates as soon as we get them. Thanks for your understanding.
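
If many builds were affected, the travis command-line client can restart them without clicking through the web UI. A small sketch, assuming the travis gem is installed and authenticated; the repository slug and build number are placeholders:

    # Restart the most recent build for a repository:
    travis restart -r owner/repo
    # Or restart a specific build by number:
    travis restart 1234 -r owner/repo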

Travis-CI
Container-based Linux reduced capacity

Nov 28, 06:30 UTC Resolved - This incident has been resolved.
Nov 28, 05:17 UTC Update - Container-based Linux capacity for private repositories is currently offline, with fresh capacity on the way within 20 minutes.
Nov 28, 04:14 UTC Update - We are beginning to roll out configuration changes, which may result in delays.
Nov 27, 23:35 UTC Monitoring - Capacity is back up, but we are planning to keep this incident open while we continue to monitor the NATs.
Nov 27, 23:02 UTC Identified - We are seeing reduced capacity due to NAT stability issues, specifically for container-based Linux public repositories. We are in the process of preparing some network configuration and alerting changes that we expect will dramatically reduce the likelihood of this problem occurring again.

Travis-CI
Container-based Linux reduced capacity

Nov 27, 19:22 UTC Resolved - Capacity is back online, and we are continuing to investigate contributing factors.
Nov 27, 18:50 UTC Investigating - We are operating at reduced capacity on our infrastructure for container-based Linux. At this time, only public repositories are affected.

Travis-CI
Intermittent availability drops for sudo:false infrastructure

Nov 26, 02:32 UTC Resolved - The queue has returned to expected levels for this point of the week. We'll continue to look into the causes of this situation.
Nov 26, 01:57 UTC Monitoring - Availability seems to have stabilised; we will continue to monitor the situation.
Nov 26, 01:13 UTC Investigating - We are currently experiencing intermittent drops in availability for our sudo: false infrastructure. At this time, we are investigating the factors that led to this and will provide an update as soon as we are able to. During this time, there may be requeued builds that will take longer than usual to complete.

Travis-CI
Intermittent availability drops for sudo:false infrastructure

Nov 22, 08:24 UTC Resolved - The queue is back to normal and the intermittent drops in availability have stabilized. We will continue to keep an eye on this during the day.
Nov 22, 01:02 UTC Monitoring - While we are still investigating this issue, it is now less prevalent. We are monitoring the issue while we look into the contributing factors that led to it appearing in the first place.
Nov 21, 20:24 UTC Investigating - We are currently experiencing intermittent drops in availability for our sudo: false infrastructure. At this time, we are investigating the factors that led to this and will provide an update as soon as we are able to. During this time, there may be requeued builds that will take longer than usual to complete.

Travis-CI
Build interruptions for builds using apt-get

Nov 16, 21:10 UTC Resolved - This incident has been resolved. Builds are processing normally.
Nov 16, 20:49 UTC Monitoring - We have pushed a fix to production and are continuing to monitor to ensure this is resolved.
Nov 16, 19:15 UTC Investigating - Builds relying on apt-get are currently failing due to a key change in the HHVM apt repository. We are currently working to implement a fix.
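
While a fix like this rolls out, builds can usually sidestep a broken third-party apt repository by dropping its source list before running apt-get update. A minimal .travis.yml sketch; the hhvm.list filename is an assumption about where the source is configured:

    before_install:
      # Assumed mitigation: remove the stale HHVM source so apt-get update
      # no longer fails on its changed signing key.
      - sudo rm -f /etc/apt/sources.list.d/hhvm.list
      - sudo apt-get update -qq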

Travis-CI
Networking issues in our macOS infrastructure are causing builds to fail

Nov 9, 16:11 UTC Resolved - We've brought macOS builds for public and private repositories back to full capacity.
Nov 9, 14:13 UTC Monitoring - The elevated error rate of macOS builds seems to have stabilized. We continue to monitor the situation.
Nov 9, 13:06 UTC Investigating - The maintenance work performed earlier today seems to have left some unresolved networking problems. No macOS builds are currently passing. We are investigating the situation.

Travis-CI
Build delays on macOS infrastructure in travis-ci.com and travis-ci.org

Nov 9, 11:18 UTC Resolved - Everything is back to normal. The macOS backlog has cleared. Thanks for bearing with us.
Nov 9, 10:38 UTC Monitoring - macOS builds are running again at full capacity. We are processing through our backlog; it will take a while before it is cleared. We'll continue to monitor things closely.
Nov 9, 10:20 UTC Update - macOS builds are still not running. We are working with our upstream provider to get things going again as fast as possible.
Nov 9, 09:37 UTC Update - The maintenance work is taking longer than expected. For more information, you can check the status page: http://status.macstadium.com/incidents/qbvgd7gc7dk2
Nov 9, 09:00 UTC Identified - Mac builds have been put on hold due to our service provider's maintenance.

Travis-CI
Reduced capacity of macOS builds

Nov 7, 21:41 UTC Resolved - We've brought macOS builds for public and private repositories back to full capacity. The backlog for private builds should be gone momentarily.
Nov 7, 18:54 UTC Update - We're still working to stabilize resources; Mac builds will continue to run at reduced capacity.
Nov 7, 17:11 UTC Update - Reduced capacity on our macOS infrastructure continues. We are still working on a fix and will update soon.
Nov 7, 16:12 UTC Identified - We've identified an issue with macOS builds that is currently causing reduced capacity. We're working to rectify it and will update shortly.

Travis-CI
Upstream Carrier Emergency Maintenance affecting macOS builds

Nov 3, 09:05 UTC Resolved - The maintenance is complete. Everything is operating normally.
Nov 3, 08:55 UTC Update - Our capacity is back, and we are starting to process macOS builds again. We continue to monitor to make sure things are running smoothly.
Nov 3, 08:35 UTC Monitoring - Upstream maintenance is preventing open-source and private macOS builds from starting. The maintenance window is 1 hour; outages are expected to add up to 10 minutes total. We're monitoring things closely.

Travis-CI
Reduced performance on private sudo: enabled repositories

Nov 2, 19:20 UTC Resolved - We have increased our available resources, and all queues are back to normal.
Nov 2, 18:57 UTC Investigating - We have noticed degraded performance on private sudo: enabled repositories, and we are working to identify and resolve this issue.

Travis-CI
Decreased capacity container jobs

Nov 2, 12:53 UTC Resolved - Everything is operating normally.
Nov 2, 11:06 UTC Monitoring - Our workaround is in place and new capacity has come online successfully. You should no longer see delays at this point. We're monitoring the situation closely.
Nov 2, 10:22 UTC Identified - We identified a problem with pulling Docker images from the registry. We're in the process of switching to another registry to mitigate the issue.
Nov 2, 09:41 UTC Investigating - We are investigating issues with scaling up capacity for container builds. This will likely result in delays in processing jobs.

Travis-CI
GitHub User Sync Delays

Oct 31, 22:58 UTC Resolved - GitHub user sync tasks are processing normally.
Oct 31, 20:41 UTC Investigating - We are currently investigating delays when syncing users from GitHub.

Travis-CI
sudo:required Capacity Issue on Linux

Oct 31, 19:49 UTC Resolved - We have recovered capacity to the previous levels and have seen healthy operation for the past 12 hours. Thank you for your patience.
Oct 31, 05:13 UTC Monitoring - At this stage, we have restored most of our capacity. We are monitoring closely for any further signs of capacity issues, and are reaching out to our partners to help isolate and prevent further recurrences.
Oct 30, 23:57 UTC Update - We are currently restoring services piece by piece, and are seeing positive results at this time. However, we are still monitoring closely.
Oct 30, 22:35 UTC Update - Resources are being cleared up; a tentative estimate is that they should be cleared up in approximately one hour. We will update as soon as we have any further news on this issue.
Oct 30, 22:08 UTC Update - We are currently working on clearing up available resources so that we can bring the services back online at full capacity, and are still investigating the factors that led to this issue.
Oct 30, 21:39 UTC Update - We are currently restarting all of our GCE instances to restore them to full service. We are still investigating the contributing factors behind this issue; the next update will follow in 30 minutes.
Oct 30, 21:07 UTC Update - Due to ongoing issues, we are performing emergency maintenance on our GCE (sudo: required) infrastructure. During this time, the service will be significantly slower or not function. We will provide an update within 30 minutes.
Oct 30, 20:41 UTC Investigating - We are seeing queues once more and are continuing to investigate the issue. At this stage, we are seeing missing capacity and are working on restoring it.
Oct 30, 19:49 UTC Monitoring - The queues and backlogs are clearing up, but we are still investigating the issue for further details and to confirm it has been fully resolved.
Oct 30, 17:37 UTC Investigating - We are currently investigating a capacity issue on our open-source infrastructure, related to a configuration change. This affects sudo: required (GCE) builds. We are deploying a new configuration that we hope will resolve this issue.

Travis-CI
Open-source builds erroring

Oct 31, 09:51 UTC Resolved - Our service has recovered; everything is operating normally. 💛
Oct 31, 09:01 UTC Monitoring - The build script generator is up. Public builds should be working correctly now. If you have any builds that failed, please try restarting them.
Oct 31, 06:51 UTC Update - The build script generator is returning errors for public builds, which causes the builds to fail. We are working with our upstream provider to resolve this situation.
Oct 31, 06:39 UTC Identified - Due to an upstream issue with our infrastructure provider, new builds on travis-ci.org are not starting.

Travis-CI
Container capacity lags behind demand

Oct 26, 16:48 UTC Resolved - Problems with fetching larger images from Docker Hub appear to have been resolved. Thank you for your patience.
Oct 26, 11:16 UTC Monitoring - We have successfully increased build processing capacity for private repositories (travis-ci.com) and have cleared the backlog of builds. We will continue to monitor the situation until the underlying issues with Docker Hub are resolved.
Oct 26, 10:00 UTC Identified - Problems pulling images from Docker Hub are affecting our ability to increase capacity for .com build processing. We are in the process of implementing a workaround that will allow us to scale up and handle jobs that are currently queued.
Oct 26, 08:16 UTC Investigating - We are seeing issues increasing capacity for container builds on .org and .com.

Travis-CI
Network issues downloading images from Docker Hub

Oct 26, 16:47 UTC Resolved - Problems with fetching larger images from Docker Hub appear to have been resolved. Thank you for your patience.
Oct 26, 16:16 UTC Monitoring - The upstream incident has been resolved. We are continuing to monitor the situation.
Oct 26, 13:49 UTC Identified - The issue has been acknowledged upstream: https://status.docker.com/pages/incident/533c6539221ae15e3f000031/59f1da512cd214649ebc33b0
Oct 26, 12:33 UTC Update - While problems pulling images from Docker Hub persist, we are now working directly with engineers at Docker to resolve this issue: https://github.com/docker/hub-feedback/issues/1225
Oct 26, 07:41 UTC Update - We are continuing to investigate this issue with our infrastructure provider. We don't have a timescale for resolution, but we will keep updating the status over the course of the day.
Oct 26, 01:20 UTC Update - We are still working to identify the contributing factors with help from our infrastructure provider. Download throughput through the CloudFront host dseasb33srnrn.cloudfront.net continues to drop rapidly to 0 soon after a transfer starts, sometimes temporarily recovering after several minutes. The impact continues to correlate strongly with total download size.
Oct 25, 17:59 UTC Update - We are continuing to investigate this issue with our infrastructure provider.
Oct 25, 15:29 UTC Update - We are continuing to work with our infrastructure provider to identify the cause of network problems affecting communication with Docker Hub.
Oct 25, 12:51 UTC Update - We've been in contact with our infrastructure provider to investigate network connectivity on sudo: required builds using the Docker service add-on.
Oct 25, 09:47 UTC Investigating - We're investigating reports of timeouts in builds while pulling images from Docker Hub.
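
For builds caught by transient pull timeouts like these, wrapping the pull in Travis's travis_retry helper, which re-runs a failing command a few times, is often enough to ride out the flakiness. A minimal .travis.yml sketch; the image name is just an example:

    services:
      - docker
    before_install:
      # travis_retry re-runs the command on failure, smoothing over
      # intermittent registry timeouts.
      - travis_retry docker pull ubuntu:16.04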

Travis-CI
Delays processing build logs and build statuses for private builds in travis-ci.com

Oct 21, 08:25 UTC Resolved - This incident has been resolved.
Oct 20, 16:43 UTC Update - A fix has been implemented and we are monitoring the results.
Oct 20, 16:43 UTC Monitoring - Our service has recovered. Build starts and state updates should not experience any further delays. We're monitoring the situation closely.
Oct 20, 09:13 UTC Update - In order to mitigate contention in our systems, we have had to restart some workers. Your builds on the container infrastructure might have been restarted in the process. Thanks for understanding.
Oct 20, 08:14 UTC Investigating - We're currently investigating reports of delays processing build logs and build statuses for private repositories. We will provide updates as soon as we have them. Thank you for your patience.

Travis-CI
Build statuses aren't updating on GitHub

Oct 20, 16:11 UTC Resolved - We believe this is a different manifestation of the issue we are already tracking here: https://www.traviscistatus.com/incidents/d7y02z19k0y6. Hence, we are closing this. Sorry for the confusion.
Oct 20, 14:04 UTC Investigating - We've received multiple reports stating that build statuses aren't updating on GitHub. We are investigating why this is happening. Thank you for your patience.

Travis-CI
Builds using sudo apt-get update failing

Oct 18, 17:05 UTC Resolved - A fix has been deployed and the issue has been resolved.
Oct 18, 15:41 UTC Investigating - There is currently an issue acquiring packages when using a sudo: required build. apt-get update fails with a 401 error on Trusty and a 404 on Precise. We are preparing a different source for the package files to resolve this. Please see here for a workaround: https://github.com/travis-ci/travis-ci/issues/8607
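
The linked GitHub issue contains the workaround recommended at the time; the sketch below is a generic, assumed mitigation rather than that specific fix. It keeps a job alive when the package index refresh fails against a broken mirror, at the cost of possibly installing stale packages:

```yaml
# Hypothetical .travis.yml sketch: tolerate a failing `apt-get update`
# (e.g. a mirror returning 401/404) instead of erroring the whole job.
sudo: required
dist: trusty
before_install:
  - travis_retry sudo apt-get update || true  # ignore index-refresh failures
```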

Travis-CI
macOS builds for private and public repositories undergoing Infrastructure Repairs

Oct 18, 00:35 UTC Resolved - This incident has been resolved.
Oct 17, 21:16 UTC Monitoring - We've made the necessary repairs and restored service to our macOS infrastructure. macOS builds for public and private repositories are running at full capacity and working through the backlog.
Oct 17, 20:44 UTC Identified - We're working on repairs to our macOS infrastructure; macOS builds for public and private repositories continue to run at reduced capacity. Thank you for your patience as we work to get things back up.
Oct 17, 17:39 UTC Investigating - Please stand by while we rebuild some VMs in our macOS infrastructure. Builds continue to process at reduced capacity; however, we expect longer queue times during this period.

Travis-CI
GitHub commit delays

Oct 17, 18:28 UTC Resolved - The GitHub commit delays have been resolved on GitHub's side, and we are not seeing any issues on our end following this.
Oct 17, 16:36 UTC Monitoring - GitHub has noted that the queue backlog is recovering. We are monitoring for any issues on our end. Please see here for more details: https://status.github.com/messages
Oct 17, 15:12 UTC Investigating - GitHub is currently experiencing and working to resolve an issue causing a backlog in their commits. Builds and tests might be delayed until this issue is resolved. For more details, see GitHub's page here: https://status.github.com/messages

Travis-CI
Reduced capacity for public and private macOS builds

Oct 11, 15:15 UTC Resolved - We've completed the needed changes. At this time, macOS builds for private and public repositories are running at full capacity. Please email support@travis-ci.com if you have any questions.
Oct 11, 12:12 UTC Investigating - We are continuing needed configuration changes on our macOS infrastructure for both private and public builds. During this time, we will be reducing capacity while we complete the work. You may experience job requeues and longer wait times. We currently do not have an ETA for when we'll return to full capacity, but we're working to restore it as soon as possible. Thank you for your patience.

Travis-CI
Unplanned: Reduced capacity for public and private macOS builds

Oct 11, 00:59 UTC Resolved - This incident has been resolved.
Oct 10, 22:06 UTC Update - We've completed a portion of the needed changes today; we'll be performing more maintenance tomorrow between 12:00 and 16:00 UTC. At this time, macOS builds for private and public repositories are running at full capacity. Please email support@travis-ci.com if you have any questions.
Oct 10, 21:49 UTC Update - We're in the process of resuming full capacity for public and private builds.
Oct 10, 18:51 UTC Identified - We are taking an unplanned period of reduced capacity for both public and private macOS builds in order to implement configuration changes needed to ensure our builds run reliably. During this time you may also see your macOS jobs requeued at times. We do not currently have an ETA for when we'll return to full capacity, but we're working to restore it as soon as possible. Thank you for your patience.

Travis-CI
Increased wait times for jobs and slow web UI on travis-ci.com

Oct 5, 15:26 UTC Resolved - Builds are running as expected and database performance has stabilised on travis-ci.com.
Oct 5, 14:52 UTC Monitoring - The backlog of builds on travis-ci.com has cleared and the API and web UI are back to normal operation. We continue to monitor the situation.
Oct 5, 14:04 UTC Update - Users are experiencing increased wait times for builds on travis-ci.com. We continue to investigate the issue and will post a further update within one hour.
Oct 5, 13:40 UTC Investigating - We are investigating high database load and query timeouts on travis-ci.com.

Travis-CI
Build statuses aren't updating on GitHub

Oct 4, 20:08 UTC Resolved - We haven't received further reports of this happening today, but we are still communicating with GitHub to understand why it happened yesterday. To better reflect the status of this incident, we are closing it for now. Thank you for your patience.
Oct 4, 09:47 UTC Update - The number of errors posting GitHub status updates has decreased over the last few hours. We're investigating closely with GitHub to get to the bottom of this issue.
Oct 3, 20:05 UTC Update - We are still trying to understand why this is happening. We have reached out to GitHub to see if they can help us troubleshoot this issue. Be assured that we will keep you posted on any new developments on that front. Thank you for hanging in there with us.
Oct 3, 18:59 UTC Investigating - We are currently seeing build statuses that are not being posted successfully to GitHub. We are trying to find the root cause of this issue and will update here when we know more. Thank you for your patience!

Travis-CI
Usage and backlog spike detected for `sudo: required` builds

Oct 3, 20:56 UTC Resolved - There is no longer any backlog for GCE users. We are still investigating the requeuing internally and determining the root cause of this issue, but end users should no longer be affected.
Oct 3, 19:27 UTC Monitoring - The backlogs have been cleared, but we continue to keep an eye on the situation to ensure all continues as intended.
Oct 3, 18:25 UTC Identified - We continue to work to lower the queues and are seeing a decrease in backlogs. Our .org backlog appears cleared, but we are still seeing some on .com. We will update once the issue is resolved or any new information is discovered.
Oct 3, 16:41 UTC Update - We've identified that internal requeues are the source of delays for sudo: required projects. We are continuing to investigate and resolve this issue and will keep posting updates here.
Oct 3, 16:21 UTC Investigating - We are experiencing an issue caused by a sudden spike of usage on GCP. We are investigating the details and will provide updates as soon as we have them. The previous issue regarding status update delays has been cleared.
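
For context on the affected pool: at the time, the `sudo` key in `.travis.yml` determined whether a job ran on a full VM (`sudo: required`, the GCE-backed infrastructure backlogged here) or on the container-based infrastructure (`sudo: false`). A minimal sketch, assuming a project that does not actually need root:

```yaml
# Hypothetical .travis.yml sketch: `sudo: false` routes the job to the
# container-based pool, sidestepping a backlog on the `sudo: required` VMs.
language: python
sudo: false
script:
  - python -c "print('running on the container infrastructure')"
```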

Travis-CI
Job Status Update Delays

Oct 3, 16:20 UTC Resolved - We are no longer experiencing a delay in status updates.
Oct 3, 15:37 UTC Investigating - Job status processing (i.e., a job moving from one stage to another, such as from "queued" to "running") is currently delayed, causing some builds to take longer to run and results longer to propagate. We have scaled up our processing capacity, the backlog has started to clear, and we are investigating what is going on. This affects both travis-ci.com and travis-ci.org.

Travis-CI
Deployment issues due to missing gem

Oct 2, 21:58 UTC Resolved - The issue has been resolved and the gem is no longer missing.
Oct 2, 21:14 UTC Identified - We are currently experiencing an issue with deployment due to a missing gem. The issue is currently under investigation and we will post updates as soon as they are available.

Travis-CI
Backlog on sudo-enabled Linux for public repositories

Sep 25, 20:31 UTC Resolved - This incident has been resolved.
Sep 25, 14:29 UTC Investigating - We are currently investigating a backlog on sudo-enabled Linux for public repositories.

Travis-CI
Private macOS build delays

Sep 22, 10:58 UTC Resolved - We have worked through the accumulated backlog for private macOS builds, which are now performing normally. We are continuing to monitor this closely. Thanks for your patience!
Sep 22, 09:42 UTC Monitoring - In an effort to stabilize our macOS infrastructure, yesterday at 17:00 UTC we also reduced the capacity available for macOS private repositories. We've now been able to address this, and since 08:40 UTC macOS private builds have been running as expected. Thank you for your understanding.

Travis-CI
Public macOS builds running at travis-ci.org experiencing a high rate of errors

Sep 22, 10:03 UTC Resolved - We've worked through the accumulated backlog for public macOS builds, which are now performing normally. We are continuing to monitor this closely. Thanks for your patience!
Sep 21, 09:00 UTC Update - We have worked through most of the backlog over the last few hours. We continue to investigate the root cause of the instability and we'll post an update as soon as we can.
Sep 20, 15:44 UTC Update - We've completed the first round of job cancellations and are seeing some improvements. We're currently evaluating other changes to help reduce the backlog and improve the wait time for jobs starting. We'll provide more updates as we learn more. Thank you for your patience.
Sep 20, 13:47 UTC Update - In an effort to clear up load on the macOS servers, we are cancelling builds that have been waiting to start for more than 6 hours. This process will start at 16:15 CEST and is expected to take approximately 2 hours to complete. We will also be rolling back to a previous version of the worker to determine whether recent changes have contributed to these issues.
Sep 20, 10:49 UTC Update - We continue to battle stability issues on the macOS platform, which are the root cause of the severe wait times for public repository builds. We will keep you posted about any updates. We apologize for the delays this is causing.
Sep 20, 01:33 UTC Update - We have not made any significant gains in understanding the source of the instability, but changes to available capacity distribution are helping to reduce the impact of disconnections.
Sep 19, 17:41 UTC Update - We are continuing to investigate the heightened AMQP timeout errors. The backlog for public macOS jobs remains high.
Sep 19, 10:50 UTC Investigating - We're investigating a higher-than-normal rate of AMQP timeout errors affecting the throughput of our builds. This is affecting public macOS jobs most at the moment, as this is the highest-demand queue.

Travis-CI
Sudo-enabled private repo backlog

Sep 21, 15:52 UTC Resolved - This incident has been resolved.
Sep 21, 15:28 UTC Monitoring - We have incurred a backlog on our sudo-enabled private repository queue while performing a graceful restart. We expect the backlog to clear once full capacity is back online after the longest-running jobs finish.

Travis-CI
Build delays for Linux `sudo: required` Open Source projects running at travis-ci.org

Sep 18, 21:31 UTC Resolved - This incident has been resolved.
Sep 18, 20:56 UTC Monitoring - We can see that the backlog for sudo-enabled builds running on GCE has cleared. We are continuing to roll out the fix to our other infrastructures.
Sep 18, 20:03 UTC Update - We've identified that a new backend change was having an unexpected negative impact on communications with a message queue and was leading to an increased backlog. We've tested out a configuration change that disables this new behavior, and it's having the positive impact we expected, so we're continuing to roll it out to all parts of our infrastructure. We'll provide updates as the rollout progresses.
Sep 18, 17:17 UTC Investigating - We're investigating an anomaly in demand that is causing a backlog on public Linux `sudo: required` builds running at travis-ci.org.

Travis-CI
[Scheduled] Database upgrade on travis-ci.org and travis-ci.com

Sep 17, 10:50 UTC Completed - The maintenance is complete, thanks for bearing with us! 💛
Sep 17, 10:02 UTC Update - Maintenance of travis-ci.com is complete and we are resuming processing of jobs. We are now beginning maintenance on travis-ci.org.
Sep 17, 08:58 UTC In progress - We are beginning our scheduled maintenance on travis-ci.com and travis-ci.org.
Sep 11, 16:19 UTC Scheduled - We are performing scheduled maintenance on travis-ci.org and travis-ci.com on Sunday, 17 September 2017, from 09:00 UTC to 12:00 UTC. We expect the API and web interface to be unavailable for some of that time window on both platforms. Processing of public and private builds is also expected to be delayed.

Travis-CI
AWS S3 us-east-1 issues affecting build caching, artifacts, and build logs

Sep 14, 20:34 UTC Resolved - The AWS S3 issues have been resolved and all services are operating normally.
Sep 14, 19:08 UTC Identified - AWS is reporting issues with S3 in us-east-1: "11:58 AM PDT - We are investigating increased error rates for Amazon S3 requests in the US-EAST-1 Region." We can confirm we're seeing these issues as well. While S3 is unstable you'll see errors from build caching/artifacts activities and may have trouble accessing older build logs, which are stored in S3 long term. We will provide updates as we learn more.
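
Build caches and artifacts live in S3, which is why an S3 outage surfaces as cache errors in otherwise healthy builds. As a hedged sketch (an assumption about one way to cope, not official guidance), a project whose jobs were erroring on cache operations could temporarily opt out of caching:

```yaml
# Hypothetical .travis.yml sketch: disable the S3-backed cache during an
# outage; restore the original `cache` section once S3 recovers.
language: ruby
cache: false  # was e.g.: cache: { directories: [vendor/bundle] }  (placeholder)
```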

Travis-CI
Build delays for private builds/travis-ci.com caused by the previous API issue

Sep 14, 20:08 UTC Resolved - A tiny backlog remains for private Mac builds, but it should clear in the next 30 minutes, so we are resolving this incident now. Thank you for your enduring patience!
Sep 14, 18:54 UTC Update - The backlog for container-based (i.e. sudo: false) Linux builds has cleared. A backlog remains for Mac builds and we will update you when it's cleared. Thank you!
Sep 14, 18:39 UTC Update - We are happy to report that the backlog has cleared for sudo-enabled Linux builds. Small backlogs remain for container-based Linux and Mac builds.
Sep 14, 18:23 UTC Monitoring - We are sorry to inform you that the previous incident (https://www.traviscistatus.com/incidents/4gy46v0t3vrq), although it has been fixed, resulted in a backlog for private builds, so you might experience some delays with your builds. Sorry for the inconvenience. We are monitoring things closely and will post updates on the state of the backlog on our different infrastructures in a timely manner. Thank you for your patience!

Travis-CI
Travis API for .com Private Builds

Sep 14, 17:36 UTC Resolved - The Travis API for .com private builds is back up. Our Redis service had to be migrated due to a hardware failure on AWS in the us-east-1 region.
Sep 14, 17:21 UTC Identified - Our API for .com builds is down due to our Redis instances being unavailable. We are working with our third-party Redis hosting service.

Travis-CI
Delays in log processing on private builds

Sep 13, 19:06 UTC Resolved - Log processing has stabilized for private builds on travis-ci.com.
Sep 13, 15:30 UTC Investigating - We are investigating delays in log processing for private builds on travis-ci.com. Log streaming is unaffected.

Travis-CI
Increased error rates on macOS builds

Sep 11, 21:11 UTC Resolved - The public macOS build backlog has reached normal peak levels and things are remaining stable, so we're closing the incident at this time. A postmortem blog post will be published in the next few days and we'll share it on Twitter when it's published. Thank you everyone for your patience and understanding during this extended incident.
Sep 11, 19:52 UTC Update - The backlog has cleared for private builds. We are continuing to monitor the situation for public/open source builds. Thanks for hanging in there with us.
Sep 11, 16:50 UTC Update - We're resuming full private macOS build capacity.
Sep 11, 15:47 UTC Update - We're seeing some instability with some of the private macOS build capacity, so we're reducing capacity temporarily.
Sep 11, 03:57 UTC Monitoring - We've resumed full build capacity for public builds. We will be monitoring things overnight and will provide further updates in the morning PDT. Thank you for your patience.
Sep 11, 03:45 UTC Update - We've completed the first phase of our SAN cleanup. Things are stable, so we're working to resume full public macOS build capacity. We'll provide another update when that's complete.
Sep 10, 19:56 UTC Update - We're now running with the previous capacity for public builds, which is still reduced from our "normal" capacity. We are continuing with the SAN cleanup and will provide updates as things progress today. Thank you for your patience.
Sep 10, 19:45 UTC Update - We temporarily have additional reduced capacity for public builds as we take some actions to continue with our SAN cleanup. We'll provide another update when that capacity has been restored.
Sep 10, 16:45 UTC Update - We've processed a backlog of approximately 9,600 macOS jobs for public repositories since re-enabling public macOS builds at 07:00 PDT yesterday. As we're still at reduced capacity and working on cleaning the SAN, we still have a backlog of ~150-200 jobs and continue to actively process them. We'll provide updates as things progress today. Thank you for your patience.
Sep 9, 15:11 UTC Update - We're continuing to process the public backlog while running the SAN cleanup. We may still need to reduce or suspend public builds later in the weekend, depending on SAN progress. Thank you for your patience.
Sep 9, 14:32 UTC Update - Capacity for macOS public repositories has been back online for ~1 hour. We're adding capacity to work through the backlog.
Sep 9, 13:04 UTC Update - The backlog for private repository builds has been clear for ~4 hours. We are planning to bring partial capacity for public repositories back online shortly.
Sep 9, 03:13 UTC Update - We've resumed running private builds at this time. We'll provide further updates on the overall progress tomorrow morning PDT. Thank you for your patience.
Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.
Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking an intensive cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.
Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.
Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.
Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped; thank you for your patience.
Sep 8, 15:01 UTC Update - macOS jobs for public and private repository builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.
Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.
Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.
Sep 8, 10:32 UTC Investigating - We continue investigating macOS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance; we'll continue posting updates as we work toward more stable performance.
Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for your patience.
Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 11, 16:50 UTC Update - We're resuming full private macOS build capacity.Sep 11, 15:47 UTC Update - We're seeing some instability with some of the private macOS build capacity and so we're reducing capacity temporarily.Sep 11, 03:57 UTC Monitoring - We've resumed full build capacity for public builds. We will be monitoring things overnight and will provide further updates in the morning PDT . Thank you for your patience.Sep 11, 03:45 UTC Update - We've completed the first phase of our SAN cleanup. Things are stable and so we're working to resume full public macOS build capacity. We'll provide another update when that's complete.Sep 10, 19:56 UTC Update - We're now running with the previous capacity for public builds, which is still reduced from our "normal" capacity. We are continuing with SAN cleanup. We'll provide updates as things progress today. Thank you for your patience.Sep 10, 19:45 UTC Update - We temporarily have additional reduced capacity for public builds, as we take some actions to continue with our SAN cleanup. We'll provide another update when that capacity has been restored.Sep 10, 16:45 UTC Update - We've processed a backlog of approximately 9,600 macOS jobs for public repositories since re-enabling public macOS builds at 07:00 PDT yesterday. As we're still at reduced capacity and working on cleaning the SAN, we still have a backlog of ~150-200 jobs and continue to actively process them. We'll provide updates as things progress today. Thank you for your patience.Sep 9, 15:11 UTC Update - We're continuing to process the public backlog while running SAN cleanup. We may still need to reduce or suspend public builds later in the weekend, depending on SAN progress. Thank you for your patience.Sep 9, 14:32 UTC Update - Capacity for macOS public repositories has been back online for ~1 hr. We're bumping additional capacity to work through the backlog.Sep 9, 13:04 UTC Update - The backlog for private repository builds has been clear for ~4h. We are planning to bring partial capacity for public repositories back online shortly.Sep 9, 03:13 UTC Update - We've resumed running private builds at this time. We'll provide further updates on the overall progress tomorrow morning PDT . Thank you for your patience.Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. 
Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 11, 15:47 UTC Update - We're seeing some instability with some of the private macOS build capacity and so we're reducing capacity temporarily.Sep 11, 03:57 UTC Monitoring - We've resumed full build capacity for public builds. We will be monitoring things overnight and will provide further updates in the morning PDT . Thank you for your patience.Sep 11, 03:45 UTC Update - We've completed the first phase of our SAN cleanup. Things are stable and so we're working to resume full public macOS build capacity. We'll provide another update when that's complete.Sep 10, 19:56 UTC Update - We're now running with the previous capacity for public builds, which is still reduced from our "normal" capacity. We are continuing with SAN cleanup. We'll provide updates as things progress today. Thank you for your patience.Sep 10, 19:45 UTC Update - We temporarily have additional reduced capacity for public builds, as we take some actions to continue with our SAN cleanup. We'll provide another update when that capacity has been restored.Sep 10, 16:45 UTC Update - We've processed a backlog of approximately 9,600 macOS jobs for public repositories since re-enabling public macOS builds at 07:00 PDT yesterday. As we're still at reduced capacity and working on cleaning the SAN, we still have a backlog of ~150-200 jobs and continue to actively process them. We'll provide updates as things progress today. Thank you for your patience.Sep 9, 15:11 UTC Update - We're continuing to process the public backlog while running SAN cleanup. We may still need to reduce or suspend public builds later in the weekend, depending on SAN progress. Thank you for your patience.Sep 9, 14:32 UTC Update - Capacity for macOS public repositories has been back online for ~1 hr. We're bumping additional capacity to work through the backlog.Sep 9, 13:04 UTC Update - The backlog for private repository builds has been clear for ~4h. We are planning to bring partial capacity for public repositories back online shortly.Sep 9, 03:13 UTC Update - We've resumed running private builds at this time. We'll provide further updates on the overall progress tomorrow morning PDT . Thank you for your patience.Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. 
Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 11, 03:57 UTC Monitoring - We've resumed full build capacity for public builds. We will be monitoring things overnight and will provide further updates in the morning PDT . Thank you for your patience.Sep 11, 03:45 UTC Update - We've completed the first phase of our SAN cleanup. Things are stable and so we're working to resume full public macOS build capacity. We'll provide another update when that's complete.Sep 10, 19:56 UTC Update - We're now running with the previous capacity for public builds, which is still reduced from our "normal" capacity. We are continuing with SAN cleanup. We'll provide updates as things progress today. Thank you for your patience.Sep 10, 19:45 UTC Update - We temporarily have additional reduced capacity for public builds, as we take some actions to continue with our SAN cleanup. We'll provide another update when that capacity has been restored.Sep 10, 16:45 UTC Update - We've processed a backlog of approximately 9,600 macOS jobs for public repositories since re-enabling public macOS builds at 07:00 PDT yesterday. As we're still at reduced capacity and working on cleaning the SAN, we still have a backlog of ~150-200 jobs and continue to actively process them. We'll provide updates as things progress today. Thank you for your patience.Sep 9, 15:11 UTC Update - We're continuing to process the public backlog while running SAN cleanup. We may still need to reduce or suspend public builds later in the weekend, depending on SAN progress. Thank you for your patience.Sep 9, 14:32 UTC Update - Capacity for macOS public repositories has been back online for ~1 hr. We're bumping additional capacity to work through the backlog.Sep 9, 13:04 UTC Update - The backlog for private repository builds has been clear for ~4h. We are planning to bring partial capacity for public repositories back online shortly.Sep 9, 03:13 UTC Update - We've resumed running private builds at this time. We'll provide further updates on the overall progress tomorrow morning PDT . Thank you for your patience.Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. 
All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 11, 03:45 UTC Update - We've completed the first phase of our SAN cleanup. Things are stable and so we're working to resume full public macOS build capacity. We'll provide another update when that's complete.Sep 10, 19:56 UTC Update - We're now running with the previous capacity for public builds, which is still reduced from our "normal" capacity. We are continuing with SAN cleanup. We'll provide updates as things progress today. Thank you for your patience.Sep 10, 19:45 UTC Update - We temporarily have additional reduced capacity for public builds, as we take some actions to continue with our SAN cleanup. We'll provide another update when that capacity has been restored.Sep 10, 16:45 UTC Update - We've processed a backlog of approximately 9,600 macOS jobs for public repositories since re-enabling public macOS builds at 07:00 PDT yesterday. As we're still at reduced capacity and working on cleaning the SAN, we still have a backlog of ~150-200 jobs and continue to actively process them. We'll provide updates as things progress today. Thank you for your patience.Sep 9, 15:11 UTC Update - We're continuing to process the public backlog while running SAN cleanup. We may still need to reduce or suspend public builds later in the weekend, depending on SAN progress. Thank you for your patience.Sep 9, 14:32 UTC Update - Capacity for macOS public repositories has been back online for ~1 hr. We're bumping additional capacity to work through the backlog.Sep 9, 13:04 UTC Update - The backlog for private repository builds has been clear for ~4h. We are planning to bring partial capacity for public repositories back online shortly.Sep 9, 03:13 UTC Update - We've resumed running private builds at this time. We'll provide further updates on the overall progress tomorrow morning PDT . Thank you for your patience.Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. 
We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 10, 19:56 UTC Update - We're now running with the previous capacity for public builds, which is still reduced from our "normal" capacity. We are continuing with SAN cleanup. We'll provide updates as things progress today. Thank you for your patience.Sep 10, 19:45 UTC Update - We temporarily have additional reduced capacity for public builds, as we take some actions to continue with our SAN cleanup. We'll provide another update when that capacity has been restored.Sep 10, 16:45 UTC Update - We've processed a backlog of approximately 9,600 macOS jobs for public repositories since re-enabling public macOS builds at 07:00 PDT yesterday. As we're still at reduced capacity and working on cleaning the SAN, we still have a backlog of ~150-200 jobs and continue to actively process them. We'll provide updates as things progress today. Thank you for your patience.Sep 9, 15:11 UTC Update - We're continuing to process the public backlog while running SAN cleanup. We may still need to reduce or suspend public builds later in the weekend, depending on SAN progress. Thank you for your patience.Sep 9, 14:32 UTC Update - Capacity for macOS public repositories has been back online for ~1 hr. We're bumping additional capacity to work through the backlog.Sep 9, 13:04 UTC Update - The backlog for private repository builds has been clear for ~4h. We are planning to bring partial capacity for public repositories back online shortly.Sep 9, 03:13 UTC Update - We've resumed running private builds at this time. We'll provide further updates on the overall progress tomorrow morning PDT . Thank you for your patience.Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. 
We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 10, 19:45 UTC Update - We temporarily have additional reduced capacity for public builds, as we take some actions to continue with our SAN cleanup. We'll provide another update when that capacity has been restored.Sep 10, 16:45 UTC Update - We've processed a backlog of approximately 9,600 macOS jobs for public repositories since re-enabling public macOS builds at 07:00 PDT yesterday. As we're still at reduced capacity and working on cleaning the SAN, we still have a backlog of ~150-200 jobs and continue to actively process them. We'll provide updates as things progress today. Thank you for your patience.Sep 9, 15:11 UTC Update - We're continuing to process the public backlog while running SAN cleanup. We may still need to reduce or suspend public builds later in the weekend, depending on SAN progress. Thank you for your patience.Sep 9, 14:32 UTC Update - Capacity for macOS public repositories has been back online for ~1 hr. We're bumping additional capacity to work through the backlog.Sep 9, 13:04 UTC Update - The backlog for private repository builds has been clear for ~4h. We are planning to bring partial capacity for public repositories back online shortly.Sep 9, 03:13 UTC Update - We've resumed running private builds at this time. We'll provide further updates on the overall progress tomorrow morning PDT . Thank you for your patience.Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. 
We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 10, 16:45 UTC Update - We've processed a backlog of approximately 9,600 macOS jobs for public repositories since re-enabling public macOS builds at 07:00 PDT yesterday. As we're still at reduced capacity and working on cleaning the SAN, we still have a backlog of ~150-200 jobs and continue to actively process them. We'll provide updates as things progress today. Thank you for your patience.Sep 9, 15:11 UTC Update - We're continuing to process the public backlog while running SAN cleanup. We may still need to reduce or suspend public builds later in the weekend, depending on SAN progress. Thank you for your patience.Sep 9, 14:32 UTC Update - Capacity for macOS public repositories has been back online for ~1 hr. We're bumping additional capacity to work through the backlog.Sep 9, 13:04 UTC Update - The backlog for private repository builds has been clear for ~4h. We are planning to bring partial capacity for public repositories back online shortly.Sep 9, 03:13 UTC Update - We've resumed running private builds at this time. We'll provide further updates on the overall progress tomorrow morning PDT . Thank you for your patience.Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. 
This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 9, 15:11 UTC Update - We're continuing to process the public backlog while running SAN cleanup. We may still need to reduce or suspend public builds later in the weekend, depending on SAN progress. Thank you for your patience.Sep 9, 14:32 UTC Update - Capacity for macOS public repositories has been back online for ~1 hr. We're bumping additional capacity to work through the backlog.Sep 9, 13:04 UTC Update - The backlog for private repository builds has been clear for ~4h. We are planning to bring partial capacity for public repositories back online shortly.Sep 9, 03:13 UTC Update - We've resumed running private builds at this time. We'll provide further updates on the overall progress tomorrow morning PDT . Thank you for your patience.Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 9, 14:32 UTC Update - Capacity for macOS public repositories has been back online for ~1 hr. We're bumping additional capacity to work through the backlog.Sep 9, 13:04 UTC Update - The backlog for private repository builds has been clear for ~4h. We are planning to bring partial capacity for public repositories back online shortly.Sep 9, 03:13 UTC Update - We've resumed running private builds at this time. We'll provide further updates on the overall progress tomorrow morning PDT . Thank you for your patience.Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 9, 13:04 UTC Update - The backlog for private repository builds has been clear for ~4h. We are planning to bring partial capacity for public repositories back online shortly.Sep 9, 03:13 UTC Update - We've resumed running private builds at this time. We'll provide further updates on the overall progress tomorrow morning PDT . Thank you for your patience.Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 9, 03:13 UTC Update - We've resumed running private builds at this time. We'll provide further updates on the overall progress tomorrow morning PDT . Thank you for your patience.Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 9, 01:28 UTC Update - We ran into an issue with booting Xcode 8.x images, so all builds are suspended again. We'll update when private builds are running.Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 9, 01:04 UTC Update - In order to help things become stable and reliable going forward, we're undertaking intense cleanup of our SAN filesystem. This cleanup is likely to take all weekend. Because of this, we're only able to resume a portion of our capacity for private builds and will not be resuming shared public builds yet. We do not currently have an ETA for when we'll be able to resume shared public builds. We will provide our next update in the morning PDT . We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 8, 21:29 UTC Update - We're working on stabilization cleanup for our SAN storage. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 8, 19:52 UTC Update - We're continuing to work on getting things into a stable state where we can potentially start running builds. At the moment we do not have an ETA for when we will resume builds. We are very sorry for the delays and will update this incident when we know more. Thank you for your patience.Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 8, 16:52 UTC Update - We've rebooted our vCenters and continue to work on stabilizing things. All macOS builds remain stopped thank you for patience.Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories builds are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.Sep 8, 10:32 UTC Investigating - We continue investigating mac OS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance, we'll continue posting updates as we work to get a more stable performance.Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for you patience.Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 8, 15:01 UTC Update - macOS jobs for public and private repositories are stopped. We're currently working with our infrastructure provider to reboot one of our vCenter instances to work out unresponsive SAN issues.
Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.
Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.
Sep 8, 10:32 UTC Investigating - We continue investigating macOS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance; we'll continue posting updates as we work toward more stable performance.
Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for your patience.
Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 8, 12:34 UTC Update - We are stopping all macOS jobs because we have run out of space on our datacenter's SAN.
Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.
Sep 8, 10:32 UTC Investigating - We continue investigating macOS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance; we'll continue posting updates as we work toward more stable performance.
Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for your patience.
Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 8, 12:34 UTC Update - We are stopping all jobs because we have run out of space on our datacenter's SAN.
Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.
Sep 8, 10:32 UTC Investigating - We continue investigating macOS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance; we'll continue posting updates as we work toward more stable performance.
Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for your patience.
Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 8, 11:07 UTC Identified - We've identified an issue with some of our Xcode image hosts, causing macOS requeues on both public and private repositories. We're working together with our upstream provider to sort this out while we continue investigating macOS build timeouts.
Sep 8, 10:32 UTC Investigating - We continue investigating macOS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance; we'll continue posting updates as we work toward more stable performance.
Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for your patience.
Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 8, 10:32 UTC Investigating - We continue investigating macOS requeues and build timeouts for both public and private repositories. This seems to be related to SAN performance; we'll continue posting updates as we work toward more stable performance.
Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for your patience.
Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 8, 08:56 UTC Monitoring - The stability of our macOS builds seems to have improved. We will continue to monitor the rate of errored builds. Thank you for your patience.
Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Build delays due to GitHub outage

Sep 8, 02:27 UTC Resolved - We have fully recovered from the GitHub outage.
Sep 7, 23:57 UTC Monitoring - We have mostly recovered from the GitHub outage, except for a draining user sync queue.
Sep 7, 21:23 UTC Identified - We are currently experiencing delayed builds due to GitHub delivering atypically few webhooks.

Travis-CI
Build delays due to GitHub outage

Sep 7, 23:57 UTC Monitoring - We have mostly recovered from the GitHub outage, except for a draining user sync queue.
Sep 7, 21:23 UTC Identified - We are currently experiencing delayed builds due to GitHub delivering atypically few webhooks.

Travis-CI
Build delays due to GitHub outage

Sep 7, 21:23 UTC Identified - We are currently experiencing delayed builds due to GitHub delivering atypically few webhooks.

Travis-CI
Increased error rates on macOS builds

Sep 7, 09:36 UTC Investigating - Repositories running on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Increased error rates on macOS builds

Sep 7, 09:36 UTC Investigating - Users on travis-ci.org and travis-ci.com are experiencing an increase in errored builds. We are investigating and will update as soon as we can.

Travis-CI
Job requeues on macOS builds

Sep 7, 03:01 UTC Resolved - We've gone from approximately 2800 to 1400 builds in the public build backlog, so we're expecting things to clear up by the morning. If things do not and we need to take further action, we will open a new incident.
Sep 6, 22:59 UTC Monitoring - We've nearly caught up with the backlog for private builds. The backlog for public builds will likely clear overnight, so we'll continue to monitor things into tomorrow and re-evaluate if we feel we need to cancel any builds tomorrow. Thank you for your patience again.
Sep 6, 22:01 UTC Update - We're still recovering things. We're catching up on the private repository backlog but still have a very large public repository backlog. We'll provide more updates as things develop. Thank you for your patience.
Sep 6, 19:36 UTC Update - We've been able to stabilize things enough that we're bringing on some more capacity. We're still working on stabilizing everything and will provide updates as things develop. Thank you for your patience.
Sep 6, 17:45 UTC Update - The previous message about Xcode images being unavailable was incorrect and has been removed.
Sep 6, 17:44 UTC Identified - The host that owns several of our Xcode images has gone offline. We will be shutting down 50% of our capacity to perform emergency maintenance. Expect longer wait times for OSX/macOS builds. Sorry for the inconvenience.
Sep 6, 15:15 UTC Update - We continue to work towards clearing the macOS backlog and stabilising our network.
Sep 6, 10:45 UTC Update - In addition to longer boot times, users are experiencing an increase in errored builds due to log timeouts when running macOS builds. We are investigating networking issues and will update again as soon as we know more.
Sep 6, 09:00 UTC Investigating - We’re investigating an increased rate of internal restarts of macOS builds, resulting in longer boot times for both public and private repositories. This has resulted in an increased backlog for macOS builds at travis-ci.org.

Travis-CI
Job requeues on macOS builds

Sep 6, 22:59 UTC Monitoring - We've nearly caught up with the backlog for private builds. The backlog for public builds will likely clear overnight, so we'll continue to monitor things into tomorrow and re-evaluate if we feel we need to cancel any builds tomorrow. Thank you for your patience again.
Sep 6, 22:01 UTC Update - We're still recovering things. We're catching up on the private repository backlog but still have a very large public repository backlog. We'll provide more updates as things develop. Thank you for your patience.
Sep 6, 19:36 UTC Update - We've been able to stabilize things enough that we're bringing on some more capacity. We're still working on stabilizing everything and will provide updates as things develop. Thank you for your patience.
Sep 6, 17:45 UTC Update - The previous message about Xcode images being unavailable was incorrect and has been removed.
Sep 6, 17:44 UTC Identified - The host that owns several of our Xcode images has gone offline. We will be shutting down 50% of our capacity to perform emergency maintenance. Expect longer wait times for OSX/macOS builds. Sorry for the inconvenience.
Sep 6, 15:15 UTC Update - We continue to work towards clearing the macOS backlog and stabilising our network.
Sep 6, 10:45 UTC Update - In addition to longer boot times, users are experiencing an increase in errored builds due to log timeouts when running macOS builds. We are investigating networking issues and will update again as soon as we know more.
Sep 6, 09:00 UTC Investigating - We’re investigating an increased rate of internal restarts of macOS builds, resulting in longer boot times for both public and private repositories. This has resulted in an increased backlog for macOS builds at travis-ci.org.

Travis-CI
Job requeues on macOS builds

Sep 6, 22:01 UTC Update - We're still recovering things. We're catching up on the private repository backlog but still have a very large public repository backlog. We'll provide more updates as things develop. Thank you for your patience.
Sep 6, 19:36 UTC Update - We've been able to stabilize things enough that we're bringing on some more capacity. We're still working on stabilizing everything and will provide updates as things develop. Thank you for your patience.
Sep 6, 17:45 UTC Update - The previous message about Xcode images being unavailable was incorrect and has been removed.
Sep 6, 17:44 UTC Identified - The host that owns several of our Xcode images has gone offline. We will be shutting down 50% of our capacity to perform emergency maintenance. Expect longer wait times for OSX/macOS builds. Sorry for the inconvenience.
Sep 6, 15:15 UTC Update - We continue to work towards clearing the macOS backlog and stabilising our network.
Sep 6, 10:45 UTC Update - In addition to longer boot times, users are experiencing an increase in errored builds due to log timeouts when running macOS builds. We are investigating networking issues and will update again as soon as we know more.
Sep 6, 09:00 UTC Investigating - We’re investigating an increased rate of internal restarts of macOS builds, resulting in longer boot times for both public and private repositories. This has resulted in an increased backlog for macOS builds at travis-ci.org.

Travis-CI
Job requeues on macOS builds

Sep 6, 19:36 UTC Update - We've been able to stabilize things enough that we're bringing on some more capacity. We're still working on stabilizing everything and will provide updates as things develop. Thank you for your patience.
Sep 6, 17:45 UTC Update - The previous message about Xcode images being unavailable was incorrect and has been removed.
Sep 6, 17:44 UTC Identified - The host that owns several of our Xcode images has gone offline. We will be shutting down 50% of our capacity to perform emergency maintenance. Expect longer wait times for OSX/macOS builds. Sorry for the inconvenience.
Sep 6, 15:15 UTC Update - We continue to work towards clearing the macOS backlog and stabilising our network.
Sep 6, 10:45 UTC Update - In addition to longer boot times, users are experiencing an increase in errored builds due to log timeouts when running macOS builds. We are investigating networking issues and will update again as soon as we know more.
Sep 6, 09:00 UTC Investigating - We’re investigating an increased rate of internal restarts of macOS builds, resulting in longer boot times for both public and private repositories. This has resulted in an increased backlog for macOS builds at travis-ci.org.

Travis-CI
Job requeues on macOS builds

Sep 6, 17:45 UTC Update - Xcode 8.2, 8.3, and 9 images are completely unavailable during this partial outage for OSX/macOS builds.
Sep 6, 17:44 UTC Identified - The host that owns several of our Xcode images has gone offline. We will be shutting down 50% of our capacity to perform emergency maintenance. Expect longer wait times for OSX/macOS builds. Sorry for the inconvenience.
Sep 6, 15:15 UTC Update - We continue to work towards clearing the macOS backlog and stabilising our network.
Sep 6, 10:45 UTC Update - In addition to longer boot times, users are experiencing an increase in errored builds due to log timeouts when running macOS builds. We are investigating networking issues and will update again as soon as we know more.
Sep 6, 09:00 UTC Investigating - We’re investigating an increased rate of internal restarts of macOS builds, resulting in longer boot times for both public and private repositories. This has resulted in an increased backlog for macOS builds at travis-ci.org.

Travis-CI
Job requeues on macOS builds

Sep 6, 17:44 UTC Identified - The host that owns several of our Xcode images has gone offline. We will be shutting down 50% of our capacity to perform emergency maintenance. Expect longer wait times for OSX/macOS builds. Sorry for the inconvenience.
Sep 6, 15:15 UTC Update - We continue to work towards clearing the macOS backlog and stabilising our network.
Sep 6, 10:45 UTC Update - In addition to longer boot times, users are experiencing an increase in errored builds due to log timeouts when running macOS builds. We are investigating networking issues and will update again as soon as we know more.
Sep 6, 09:00 UTC Investigating - We’re investigating an increased rate of internal restarts of macOS builds, resulting in longer boot times for both public and private repositories. This has resulted in an increased backlog for macOS builds at travis-ci.org.

Travis-CI
Job requeues on macOS builds

Sep 6, 15:15 UTC Update - We continue to work towards clearing the macOS backlog and stabilising our network.
Sep 6, 10:45 UTC Update - In addition to longer boot times, users are experiencing an increase in errored builds due to log timeouts when running macOS builds. We are investigating networking issues and will update again as soon as we know more.
Sep 6, 09:00 UTC Investigating - We’re investigating an increased rate of internal restarts of macOS builds, resulting in longer boot times for both public and private repositories. This has resulted in an increased backlog for macOS builds at travis-ci.org.

Travis-CI
Job requeues on macOS builds

Sep 6, 10:45 UTC Update - In addition to longer boot times, users are experiencing an increase in errored builds due to log timeouts when running macOS builds. We are investigating networking issues and will update again as soon as we know more.
Sep 6, 09:00 UTC Investigating - We’re investigating an increased rate of internal restarts of macOS builds, resulting in longer boot times for both public and private repositories. This has resulted in an increased backlog for macOS builds at travis-ci.org.

Travis-CI
Job requeues on macOS builds

Sep 6, 09:00 UTC Investigating - We’re investigating an increased rate of internal restarts of macOS builds, resulting in longer boot times for both public and private repositories. This has resulted in an increased backlog for macOS builds at travis-ci.org.

Travis-CI
Reduced OSX capacity on .com

Aug 23, 21:11 UTC Resolved - The backlog for OSX private .com builds has cleared.
Aug 23, 15:59 UTC Monitoring - OSX infrastructure is operating at full capacity. Jobs are processing normally; thank you for your patience as we clear the backlog.
Aug 23, 15:21 UTC Update - We discovered several of our DHCP hosts for OSX builds were down. This has had cascading effects on resources, ultimately requiring us to stop jobs while we work to stabilize our infrastructure. We will bring hosts back up one-by-one and start up job VMs momentarily.
Aug 23, 13:21 UTC Identified - To address the infrastructure instability and reduced capacity issues on OSX builds, we need to perform emergency maintenance, bringing all running OSX builds down. The jobs will be restarted as soon as there is a slot free.
Aug 23, 09:12 UTC Investigating - We are seeing decreased capacity for OSX builds on .com.

Travis-CI
Reduced OSX capacity on .com

Aug 23, 15:59 UTC Monitoring - OSX infrastructure is operating at full capacity. Jobs are processing normally; thank you for your patience as we clear the backlog.
Aug 23, 15:21 UTC Update - We discovered several of our DHCP hosts for OSX builds were down. This has had cascading effects on resources, ultimately requiring us to stop jobs while we work to stabilize our infrastructure. We will bring hosts back up one-by-one and start up job VMs momentarily.
Aug 23, 13:21 UTC Identified - To address the infrastructure instability and reduced capacity issues on OSX builds, we need to perform emergency maintenance, bringing all running OSX builds down. The jobs will be restarted as soon as there is a slot free.
Aug 23, 09:12 UTC Investigating - We are seeing decreased capacity for OSX builds on .com.

Travis-CI
Reduced OSX capacity on .com

Aug 23, 15:21 UTC Update - We discovered several of our DHCP hosts for OSX builds were down. This has had cascading effects on resources, ultimately requiring us to stop jobs while we work to stabilize our infrastructure. We will bring hosts back up one-by-one and start up job VMs momentarily.
Aug 23, 13:21 UTC Identified - To address the infrastructure instability and reduced capacity issues on OSX builds, we need to perform emergency maintenance, bringing all running OSX builds down. The jobs will be restarted as soon as there is a slot free.
Aug 23, 09:12 UTC Investigating - We are seeing decreased capacity for OSX builds on .com.

Travis-CI
Reduced OSX capacity on .com

Aug 23, 13:21 UTC Identified - To address the infrastructure instability and reduced capacity issues on OSX builds, we need to perform emergency maintenance, bringing all running OSX builds down. The jobs will be restarted as soon as there is a slot free.
Aug 23, 09:12 UTC Investigating - We are seeing decreased capacity for OSX builds on .com.

Travis-CI
Reduced OSX capacity on .com

Aug 23, 09:12 UTC Investigating - We are seeing decreased capacity for OSX builds on .com.

Travis-CI
Build requests and sign in via GitHub slow or unavailable

Aug 21, 17:14 UTC Resolved - There is a backlog remaining for macOS public repositories that is typical for this time of day/week. Thanks for your patience!
Aug 21, 15:29 UTC Monitoring - Build queues on our Docker infrastructure have cleared. Mac queues continue to experience delays.
Aug 21, 15:03 UTC Update - We have finished processing the backlog of build requests and are in the process of scaling out extra capacity to deal with the influx of builds.
Aug 21, 14:31 UTC Update - GitHub have identified and addressed the source of connectivity issues. We are beginning to process the backlog of build requests.
Aug 21, 13:25 UTC Identified - We are seeing increased response times on GitHub’s API in several components, causing sign-in on Travis CI to fail, build requests to be delayed, and account syncing to be slow.

Travis-CI
Build requests and sign in via GitHub slow or unavailable

Aug 21, 15:29 UTC Monitoring - Build queues on our Docker infrastructure have cleared. Mac queues continue to experience delays.
Aug 21, 15:03 UTC Update - We have finished processing the backlog of build requests and are in the process of scaling out extra capacity to deal with the influx of builds.
Aug 21, 14:31 UTC Update - GitHub have identified and addressed the source of connectivity issues. We are beginning to process the backlog of build requests.
Aug 21, 13:25 UTC Identified - We are seeing increased response times on GitHub’s API in several components, causing sign-in on Travis CI to fail, build requests to be delayed, and account syncing to be slow.

Travis-CI
Build requests and sign in via GitHub slow or unavailable

Aug 21, 15:03 UTC Update - We have finished processing the backlog of build requests and are in the process of scaling out extra capacity to deal with the influx of builds.
Aug 21, 14:31 UTC Update - GitHub have identified and addressed the source of connectivity issues. We are beginning to process the backlog of build requests.
Aug 21, 13:25 UTC Identified - We are seeing increased response times on GitHub’s API in several components, causing sign-in on Travis CI to fail, build requests to be delayed, and account syncing to be slow.

Travis-CI
Build requests and sign in via GitHub slow or unavailable

Aug 21, 14:31 UTC Update - GitHub have identified and addressed the source of connectivity issues. We are beginning to process the backlog of build requests.
Aug 21, 13:25 UTC Identified - We are seeing increased response times on GitHub’s API in several components, causing sign-in on Travis CI to fail, build requests to be delayed, and account syncing to be slow.

Travis-CI
Build requests and sign in via GitHub slow or unavailable

Aug 21, 13:25 UTC Identified - We are seeing increased response times on GitHub’s API in several components, causing sign-in on Travis CI to fail, build requests to be delayed, and account syncing to be slow.

Travis-CI
[Planned] macOS Infrastructure Network Maintenance

Aug 17, 01:48 UTC Completed - Maintenance has been completed.
Aug 17, 01:01 UTC In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary.
Aug 11, 17:32 UTC Scheduled - We'll be implementing and testing some changes to a portion of our macOS networking infrastructure to help improve build performance. During this maintenance, users will experience reduced build capacity for both public and private repository builds. We do not expect to need to take things entirely offline. If you have any questions, please email support@travis-ci.com.

Travis-CI
[Planned] macOS Infrastructure Network Maintenance

Aug 17, 01:01 UTC In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary.
Aug 11, 17:32 UTC Scheduled - We'll be implementing and testing some changes to a portion of our macOS networking infrastructure to help improve build performance. During this maintenance, users will experience reduced build capacity for both public and private repository builds. We do not expect to need to take things entirely offline. If you have any questions, please email support@travis-ci.com.

Travis-CI
[Planned] macOS Infrastructure Network Maintenance

Aug 11, 17:32 UTC Scheduled - We'll be implementing and testing some changes to a portion of our macOS networking infrastructure to help improve build performance. During this maintenance, users will experience reduced build capacity for both public and private repository builds. We do not expect to need to take things entirely offline. If you have any questions, please email support@travis-ci.com.

Travis-CI
Partial Reduction in Capacity for Private macOS builds

Aug 9, 20:16 UTC Resolved - At this time things are operating in a stable fashion. Please email support@travis-ci.com if you are still seeing any issues.
Aug 9, 19:16 UTC Monitoring - A fix has been implemented and we are monitoring the results.
Aug 9, 19:12 UTC Update - We've been able to remove the problem host and have restored full capacity for private macOS builds. We are monitoring things closely.
Aug 9, 18:31 UTC Identified - We are responding to downed hosts servicing our private macOS builds. Builds are operating at reduced capacity; some wait time is expected.

Travis-CI
Partial Reduction in Capacity for Private macOS builds

Aug 9, 19:16 UTC Monitoring - A fix has been implemented and we are monitoring the results.
Aug 9, 19:12 UTC Update - We've been able to remove the problem host and have restored full capacity for private macOS builds. We are monitoring things closely.
Aug 9, 18:31 UTC Identified - We are responding to downed hosts servicing our private macOS builds. Builds are operating at reduced capacity; some wait time is expected.

Travis-CI
Partial Reduction in Capacity for Private macOS builds

Aug 9, 19:12 UTC Update - We've been able to remove the problem host and have restored full capacity for private macOS builds. We are monitoring things closely.
Aug 9, 18:31 UTC Identified - We are responding to downed hosts servicing our private macOS builds. Builds are operating at reduced capacity; some wait time is expected.

Travis-CI
Partial Reduction in Capacity for Private macOS builds

Aug 9, 18:31 UTC Identified - We are responding to downed hosts servicing our private macOS builds. Builds are operating at reduced capacity; some wait time is expected.

Travis-CI
Reduced macOS capacity for public and private builds.

Aug 9, 02:15 UTC Resolved - We're running production builds again at full capacity. We've accrued a large backlog of public macOS builds and we're processing them, but it will be a few hours before the backlog is cleared. We'll continue to monitor things closely. Thank you for your patience while we worked to resolve this incident.
Aug 9, 01:57 UTC Update - We believe we've identified the issue and are beginning to test running jobs in the portion of our infrastructure that was having issues. We'll provide an update as we start to run production builds through it.
Aug 9, 01:30 UTC Update - We've begun experiencing a new set of errors that is preventing us from restoring full capacity. We are investigating.
Aug 9, 01:03 UTC Update - We've been able to get things cleanly restarted and we're beginning to ramp back up to full capacity. We'll post an update once we've ramped back up. Thanks for your patience.
Aug 9, 00:14 UTC Update - Unfortunately, we're running into some unexpected errors and are working to resolve them. We'll provide another update within 60 minutes.
Aug 8, 23:24 UTC Update - We're in the process of restarting some components. We'll provide another update within 60 minutes.
Aug 8, 22:33 UTC Update - We're continuing to work on stabilizing things. We'll provide another update within 60 minutes.
Aug 8, 21:42 UTC Identified - We're currently working to stabilize things. We'll provide another update within 60 minutes. Thank you for your patience.
Aug 8, 21:09 UTC Investigating - We are currently seeing instability in part of our macOS infrastructure. This is resulting in delays for both public and private macOS builds. We're investigating the issue.

Travis-CI
Reduced macOS capacity for public and private builds.

Aug 9, 01:57 UTC Update - We believe we've identified the issue and are beginning to test running jobs in the portion of our infrastructure that was having issues. We'll provide an update as we start to run production builds through it.
Aug 9, 01:30 UTC Update - We've begun experiencing a new set of errors that is preventing us from restoring full capacity. We are investigating.
Aug 9, 01:03 UTC Update - We've been able to get things cleanly restarted and we're beginning to ramp back up to full capacity. We'll post an update once we've ramped back up. Thanks for your patience.
Aug 9, 00:14 UTC Update - Unfortunately, we're running into some unexpected errors and are working to resolve them. We'll provide another update within 60 minutes.
Aug 8, 23:24 UTC Update - We're in the process of restarting some components. We'll provide another update within 60 minutes.
Aug 8, 22:33 UTC Update - We're continuing to work on stabilizing things. We'll provide another update within 60 minutes.
Aug 8, 21:42 UTC Identified - We're currently working to stabilize things. We'll provide another update within 60 minutes. Thank you for your patience.
Aug 8, 21:09 UTC Investigating - We are currently seeing instability in part of our macOS infrastructure. This is resulting in delays for both public and private macOS builds. We're investigating the issue.

Travis-CI
Reduced macOS capacity for public and private builds.

Aug 9, 01:30 UTC Update - We've begun experiencing a new set of errors that is preventing us from restoring full capacity. We are investigating.
Aug 9, 01:03 UTC Update - We've been able to get things cleanly restarted and we're beginning to ramp back up to full capacity. We'll post an update once we've ramped back up. Thanks for your patience.
Aug 9, 00:14 UTC Update - Unfortunately, we're running into some unexpected errors and are working to resolve them. We'll provide another update within 60 minutes.
Aug 8, 23:24 UTC Update - We're in the process of restarting some components. We'll provide another update within 60 minutes.
Aug 8, 22:33 UTC Update - We're continuing to work on stabilizing things. We'll provide another update within 60 minutes.
Aug 8, 21:42 UTC Identified - We're currently working to stabilize things. We'll provide another update within 60 minutes. Thank you for your patience.
Aug 8, 21:09 UTC Investigating - We are currently seeing instability in part of our macOS infrastructure. This is resulting in delays for both public and private macOS builds. We're investigating the issue.

Travis-CI
Reduced macOS capacity for public and private builds.

Aug 9, 01:03 UTC Update - We've been able to get things cleanly restarted and we're beginning to ramp back up to full capacity. We'll post an update once we've ramped back up. Thanks for your patience.
Aug 9, 00:14 UTC Update - Unfortunately, we're running into some unexpected errors and are working to resolve them. We'll provide another update within 60 minutes.
Aug 8, 23:24 UTC Update - We're in the process of restarting some components. We'll provide another update within 60 minutes.
Aug 8, 22:33 UTC Update - We're continuing to work on stabilizing things. We'll provide another update within 60 minutes.
Aug 8, 21:42 UTC Identified - We're currently working to stabilize things. We'll provide another update within 60 minutes. Thank you for your patience.
Aug 8, 21:09 UTC Investigating - We are currently seeing instability in part of our macOS infrastructure. This is resulting in delays for both public and private macOS builds. We're investigating the issue.

Travis-CI
Reduced macOS capacity for public and private builds.

Aug 9, 00:14 UTC Update - Unfortunately, we're running into some unexpected errors and are working to resolve them. We'll provide another update within 60 minutes.
Aug 8, 23:24 UTC Update - We're in the process of restarting some components. We'll provide another update within 60 minutes.
Aug 8, 22:33 UTC Update - We're continuing to work on stabilizing things. We'll provide another update within 60 minutes.
Aug 8, 21:42 UTC Identified - We're currently working to stabilize things. We'll provide another update within 60 minutes. Thank you for your patience.
Aug 8, 21:09 UTC Investigating - We are currently seeing instability in part of our macOS infrastructure. This is resulting in delays for both public and private macOS builds. We're investigating the issue.

Travis-CI
Reduced macOS capacity for public and private builds.

Aug 8, 23:24 UTC Update - We're in the process of restarting some components. We'll provide another update within 60 minutes.
Aug 8, 22:33 UTC Update - We're continuing to work on stabilizing things. We'll provide another update within 60 minutes.
Aug 8, 21:42 UTC Identified - We're currently working to stabilize things. We'll provide another update within 60 minutes. Thank you for your patience.
Aug 8, 21:09 UTC Investigating - We are currently seeing instability in part of our macOS infrastructure. This is resulting in delays for both public and private macOS builds. We're investigating the issue.

Travis-CI
Reduced macOS capacity for public and private builds.

Aug 8, 22:33 UTC Update - We're continuing to work on stabilizing things. We'll provide another update within 60 minutes.
Aug 8, 21:42 UTC Identified - We're currently working to stabilize things. We'll provide another update within 60 minutes. Thank you for your patience.
Aug 8, 21:09 UTC Investigating - We are currently seeing instability in part of our macOS infrastructure. This is resulting in delays for both public and private macOS builds. We're investigating the issue.

Travis-CI
Reduced macOS capacity for public and private builds.

Aug 8, 21:42 UTC Identified - We're currently working to stabilize things. We'll provide another update within 60 minutes. Thank you for your patience.
Aug 8, 21:09 UTC Investigating - We are currently seeing instability in part of our macOS infrastructure. This is resulting in delays for both public and private macOS builds. We're investigating the issue.

Travis-CI
Reduced macOS capacity for public and private builds.

Aug 8, 21:09 UTC Investigating - We are currently seeing instability in part of our macOS infrastructure. This is resulting in delays for both public and private macOS builds. We're investigating the issue.

Travis-CI
Delays receiving events from GitHub for both public and private repositories

Jul 31, 20:41 UTC Resolved - Our GitHub sync queues have drained, and everything is operating normally. ✨
Jul 31, 18:18 UTC Update - We have resumed regular service as GitHub recovers. We’re processing a backlog of GitHub sync requests.
Jul 31, 17:32 UTC Monitoring - GitHub is returning to normal service; we’ll continue to monitor the situation.
Jul 31, 17:01 UTC Identified - GitHub’s currently experiencing a major service outage; we’re monitoring the situation closely.

Travis-CI
Delays receiving events from GitHub for both public and private repositories

Jul 31, 18:18 UTC Update - We have resumed regular service as GitHub recovers. We’re processing a backlog of GitHub sync requests.
Jul 31, 17:32 UTC Monitoring - GitHub is returning to normal service; we’ll continue to monitor the situation.
Jul 31, 17:01 UTC Identified - GitHub’s currently experiencing a major service outage; we’re monitoring the situation closely.

Travis-CI
Delays receiving events from GitHub for both public and private repositories

Jul 31, 17:32 UTC Monitoring - GitHub is returning to normal service; we’ll continue to monitor the situation.
Jul 31, 17:01 UTC Identified - GitHub’s currently experiencing a major service outage; we’re monitoring the situation closely.

Travis-CI
Delays receiving events from GitHub for both public and private repositories

Jul 31, 17:01 UTC Identified - GitHub’s currently experiencing a major service outage; we’re monitoring the situation closely.

Travis-CI
Delays for private and open-source container-based builds

Jul 27, 15:46 UTC Resolved - Backlogs have cleared.
Jul 27, 15:17 UTC Monitoring - An EC2 network outage impacted our capacity, which has created a backlog for builds in our container-based infrastructure. We are adding capacity in order to work through this backlog more quickly.

Travis-CI
Delays for private and open-source container-based builds

Jul 27, 15:17 UTC Monitoring - An EC2 network outage impacted our capacity, which has created a backlog for builds in our container-based infrastructure. We are adding capacity in order to work through this backlog more quickly.

Travis-CI
OSX builds routed to Linux images

Jul 19, 13:01 UTC Resolved - The regression bug that was introduced has now been fixed; builds are routing as intended.
Jul 19, 12:04 UTC Monitoring - Due to a regression, some OSX builds were briefly routed to Linux images. This change has been reverted. We are closely monitoring the situation. Restarting affected jobs should run the build on the right infrastructure. We apologize for the inconvenience caused.
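For context: restarting an affected job can be done from the build's page in the web UI or, as a minimal sketch assuming the `travis` command-line client is installed and authenticated, from the command line. The repository slug and job number below are hypothetical.

    gem install travis                      # install the Travis CI command-line client
    travis login --org                      # authenticate against travis-ci.org
    travis restart 1234.1 -r example/repo   # restart job 1 of build 1234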

Travis-CI
OSX builds routed to Linux images

Jul 19, 12:04 UTC Monitoring - Due to a regression, some OSX builds were briefly routed to Linux images. This change has been reverted. We are closely monitoring the situation. Restarting affected jobs should run the build on the right infrastructure. We apologize for the inconvenience caused.

Travis-CI
Reduced capacity for macOS builds in travis-ci.com and travis-ci.org

Jul 13, 17:54 UTC Resolved - At this time we've cleared the backlog for travis-ci.com builds. There is still a large backlog for travis-ci.org builds, but it's not higher than what we've been seeing this week, and it'll clear as usage tapers off later today, so we are resolving this incident.
Jul 13, 15:36 UTC Monitoring - The macOS capacity has been restored in travis-ci.com and travis-ci.org. We are closely monitoring the situation and we are processing the macOS build backlog.
Jul 13, 15:19 UTC Identified - We have identified the problem and we’re working together with our upstream provider to bring capacity back to our macOS infrastructure.
Jul 13, 14:43 UTC Investigating - We’re currently investigating reduced capacity in our macOS infrastructure. Build delays are expected. Thank you for your patience.

Travis-CI
Reduced capacity for macOS builds in travis-ci.com and travis-ci.org

Jul 13, 15:36 UTC Monitoring - The macOS capacity has been restored in travis-ci.com and travis-ci.org. We are closely monitoring the situation and we are processing the macOS build backlog.
Jul 13, 15:19 UTC Identified - We have identified the problem and we’re working together with our upstream provider to bring capacity back to our macOS infrastructure.
Jul 13, 14:43 UTC Investigating - We’re currently investigating reduced capacity in our macOS infrastructure. Build delays are expected. Thank you for your patience.

Travis-CI
Reduced capacity for macOS builds in travis-ci.com and travis-ci.org

Jul 13, 15:19 UTC Identified - We have identified the problem and we’re working together with our upstream provider to bring capacity back to our macOS infrastructure.
Jul 13, 14:43 UTC Investigating - We’re currently investigating reduced capacity in our macOS infrastructure. Build delays are expected. Thank you for your patience.

Travis-CI
Reduced capacity for macOS builds in travis-ci.com and travis-ci.org

Jul 13, 14:43 UTC Investigating - We’re currently investigating reduced capacity in our macOS infrastructure. Build delays are expected. Thank you for your patience.

Travis-CI
macOS network outage

Jul 13, 02:31 UTC Resolved - We have caught up with the backlogs.
Jul 12, 18:16 UTC Update - The upstream network issue has been resolved. We are processing backlogs.
Jul 12, 14:47 UTC Monitoring - The network seems to be back. We’re beginning to process the macOS backlog and are monitoring the situation closely.
Jul 12, 13:10 UTC Update - We’re experiencing another network outage which is interrupting macOS builds. Our upstream provider is investigating as well: http://status.macstadium.com/incidents/p584yykj95wn
Jul 12, 12:55 UTC Identified - We experienced a brief network outage which interrupted all macOS builds. Things seem to be recovering. We are monitoring things closely and working on getting more information about the cause of the network issue.

Travis-CI
macOS network outage

Jul 12, 18:16 UTC Update - The upstream network issue has been resolved. We are processing backlogs.
Jul 12, 14:47 UTC Monitoring - The network seems to be back. We’re beginning to process the macOS backlog and are monitoring the situation closely.
Jul 12, 13:10 UTC Update - We’re experiencing another network outage which is interrupting macOS builds. Our upstream provider is investigating as well: http://status.macstadium.com/incidents/p584yykj95wn
Jul 12, 12:55 UTC Identified - We experienced a brief network outage which interrupted all macOS builds. Things seem to be recovering. We are monitoring things closely and working on getting more information about the cause of the network issue.

Travis-CI
macOS network outage

Jul 12, 14:47 UTC Monitoring - The network seems to be back. We’re beginning to process the macOS backlog and are monitoring the situation closely.
Jul 12, 13:10 UTC Update - We’re experiencing another network outage which is interrupting macOS builds. Our upstream provider is investigating as well: http://status.macstadium.com/incidents/p584yykj95wn
Jul 12, 12:55 UTC Identified - We experienced a brief network outage which interrupted all macOS builds. Things seem to be recovering. We are monitoring things closely and working on getting more information about the cause of the network issue.

Travis-CI
macOS network outage

Jul 12, 13:10 UTC Update - We’re experiencing another network outage which is interrupting macOS builds. Our upstream provider is investigating as well: http://status.macstadium.com/incidents/p584yykj95wn
Jul 12, 12:55 UTC Identified - We experienced a brief network outage which interrupted all macOS builds. Things seem to be recovering. We are monitoring things closely and working on getting more information about the cause of the network issue.

Travis-CI
macOS network outage

Jul 12, 12:55 UTC Identified - We experienced a brief network outage which interrupted all macOS builds. Things seem to be recovering. We are monitoring things closely and working on getting more information about the cause of the network issue.

Travis-CI
apt-get failures due to outdated GPG key

Jun 29, 23:20 UTC Resolved - The issue has been resolved.
Jun 29, 23:06 UTC Monitoring - A hotfix has been deployed to remove the offending apt source. If your build does not need this source, `apt-get` commands should now succeed. See https://github.com/travis-ci/travis-ci/issues/8002 for further details.
Jun 29, 22:59 UTC Update - We identified an apt source that is missing a GPG key. We will remove this source as an emergency measure to remedy the apt-get failures.
Jun 29, 22:42 UTC Identified - We believe we've identified the source of the issue and are working on a fix.
Jun 29, 22:23 UTC Investigating - `apt-get` commands are failing due to a missing GPG key. We are investigating.
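For context, a minimal sketch of the kind of workaround a build could apply while such a hotfix rolled out, assuming the offending source lives under /etc/apt/sources.list.d/. The file name and package below are hypothetical; see the linked issue for the actual source.

    # drop the apt source whose GPG key is missing, then retry apt-get
    sudo rm -f /etc/apt/sources.list.d/offending-source.list
    sudo apt-get update
    sudo apt-get install -y build-essential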

Travis-CI
apt-get failures due to outdated GPG key

Jun 29, 23:06 UTC Monitoring - A hotfix has been deployed to remove the offending apt source. If your build does not need this source, `apt-get` commands should now succeed. See https://github.com/travis-ci/travis-ci/issues/8002 for further details.
Jun 29, 22:59 UTC Update - We identified an apt source that is missing a GPG key. We will remove this source as an emergency measure to remedy the apt-get failures.
Jun 29, 22:42 UTC Identified - We believe we've identified the source of the issue and are working on a fix.
Jun 29, 22:23 UTC Investigating - `apt-get` commands are failing due to a missing GPG key. We are investigating.

Travis-CI
apt-get failures due to outdated GPG key

Jun 29, 22:59 UTC Update - We identified an apt source that is missing a GPG key. We will remove this source as an emergency measure to remedy the apt-get failures.
Jun 29, 22:42 UTC Identified - We believe we've identified the source of the issue and are working on a fix.
Jun 29, 22:23 UTC Investigating - `apt-get` commands are failing due to a missing GPG key. We are investigating.

Travis-CI
apt-get failures due to outdated GPG key

Jun 29, 22:42 UTC Identified - We believe we've identified the source of the issue and are working on a fix.
Jun 29, 22:23 UTC Investigating - `apt-get` commands are failing due to a missing GPG key. We are investigating.

Travis-CI
apt-get failures due to outdated GPG key

Jun 29, 22:23 UTC Investigating - `apt-get` commands are failing due to a missing GPG key. We are investigating.

Travis-CI
Delays starting builds for public repositories

Jun 29, 16:13 UTC Resolved - Linux build backlogs have cleared. Builds are processing normally.
Jun 29, 15:16 UTC Monitoring - Builds affected by this issue are now slowly being processed. We’ll continue posting updates on their progress.
Jun 29, 14:43 UTC Identified - We’ve identified an issue with one of our backend applications that was causing a delay in scheduling builds for public repositories. We’ve just fixed this issue and are currently working on scheduling the builds it affected.

Travis-CI
Delays starting builds for public repositories

Jun 29, 15:16 UTC Monitoring - Builds affected by this issue are now slowly being processed. We’ll continue posting updates on their progress.
Jun 29, 14:43 UTC Identified - We’ve identified an issue with one of our backend applications that was causing a delay in scheduling builds for public repositories. We’ve just fixed this issue and are currently working on scheduling the builds it affected.

Travis-CI
Delays starting builds for public repositories

Jun 29, 14:43 UTC Identified - We’ve identified an issue with one of our backend applications that was causing a delay in scheduling builds for public repositories. We’ve just fixed this issue and are currently working on scheduling the builds it affected.

Travis-CI
Delay processing builds for public repositories

Jun 29, 13:59 UTC Resolved - The increased backlog has been processed.
Jun 29, 13:57 UTC Update - The backlogs have calmed down. We expect the backlog of container-based Precise builds to be processed in 15 minutes.
Jun 29, 13:19 UTC Monitoring - The cause of the delay has been removed. We are monitoring the situation while the accrued backlog is processed.
Jun 29, 13:12 UTC Investigating - We are investigating a delay in scheduling build requests for open source builds.

Travis-CI
Delay processing builds for public repositories

Jun 29, 13:57 UTC Update - The backlogs have calmed down. We expect the backlog of container-based Precise builds to be processed in 15 minutes.
Jun 29, 13:19 UTC Monitoring - The cause of the delay has been removed. We are monitoring the situation while the accrued backlog is processed.
Jun 29, 13:12 UTC Investigating - We are investigating a delay in scheduling build requests for open source builds.

Travis-CI
Delay processing builds for public repositories

Jun 29, 13:19 UTC Monitoring - The cause of the delay has been removed. We are monitoring the situation while the accrued backlog is processed.
Jun 29, 13:12 UTC Investigating - We are investigating a delay in scheduling build requests for open source builds.

Travis-CI
Delay processing builds for public repositories

Jun 29, 13:12 UTC Investigating - We are investigating a delay in scheduling build requests for open source builds.

Travis-CI
Assets not loading on the web frontend

Jun 28, 16:36 UTC Resolved - Our upstream CDN provider has resolved the issue! 🎉
Jun 28, 14:21 UTC Monitoring - Our upstream CDN provider has implemented a fix. Our service has recovered. We are monitoring the situation.
Jun 28, 14:13 UTC Update - Our upstream CDN provider is working on a fix, and we are seeing some recovery of service.
Jun 28, 14:07 UTC Identified - We are investigating reports of assets not loading on our website in some regions. Our upstream CDN provider is aware of the issue, and we are investigating the possibility of a workaround. The web UI is only partially available at this time.

Travis-CI
Assets not loading on the web frontend

Jun 28, 14:21 UTC Monitoring - Our upstream CDN provider has implemented a fix. Our service has recovered. We are monitoring the situation.
Jun 28, 14:13 UTC Update - Our upstream CDN provider is working on a fix, and we are seeing some recovery of service.
Jun 28, 14:07 UTC Identified - We are investigating reports of assets not loading on our website in some regions. Our upstream CDN provider is aware of the issue, and we are investigating the possibility of a workaround. The web UI is only partially available at this time.

Travis-CI
Assets not loading on the web frontend

Jun 28, 14:13 UTC Update - Our upstream CDN provider is working on a fix, and we are seeing some recovery of service.
Jun 28, 14:07 UTC Identified - We are investigating reports of assets not loading on our website in some regions. Our upstream CDN provider is aware of the issue, and we are investigating the possibility of a workaround. The web UI is only partially available at this time.

Travis-CI
Assets not loading on the web frontend

Jun 28, 14:07 UTC Identified - We are investigating reports of assets not loading on our website in some regions. Our upstream CDN provider is aware of the issue, and we are investigating the possibility of a workaround. The web UI is only partially available at this time.

Travis-CI
Apt failures

Jun 28, 12:36 UTC Resolved - The issue has been identified and a hotfix is in place. We are monitoring the situation; builds should be running normally again (a restart should do the trick). Please reach out to support@travis-ci.com if you continue to see apt-get failures.
Jun 28, 10:43 UTC Investigating - We are seeing issues with apt-get on our images. We are investigating and will let you know when we know more.

Travis-CI
Apt failures

Jun 28, 10:43 UTC Investigating - We are seeing issues with apt-get on our images. We are investigating and will let you know when we know more.

Travis-CI
Private Repository RabbitMQ Upgrade

Jun 23, 02:15 UTC Completed - The scheduled maintenance has been completed.
Jun 23, 01:53 UTC Verifying - Verification is currently underway for the maintenance items.
Jun 23, 01:30 UTC In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary.
Jun 22, 23:49 UTC Scheduled - We need to perform an upgrade to the RabbitMQ cluster used by the infrastructure for private repositories. We do not expect any downtime during this upgrade.

Travis-CI
Private Repository RabbitMQ Upgrade

Jun 23, 01:53 UTC Verifying - Verification is currently underway for the maintenance items.
Jun 23, 01:30 UTC In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary.
Jun 22, 23:49 UTC Scheduled - We need to perform an upgrade to the RabbitMQ cluster used by the infrastructure for private repositories. We do not expect any downtime during this upgrade.

Travis-CI
Private Repository RabbitMQ Upgrade

Jun 23, 01:30 UTC In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary.
Jun 22, 23:49 UTC Scheduled - We need to perform an upgrade to the RabbitMQ cluster used by the infrastructure for private repositories. We do not expect any downtime during this upgrade.

Travis-CI
Private Repository RabbitMQ Upgrade

Jun 22, 23:49 UTC Scheduled - We need to perform an upgrade to the RabbitMQ cluster used by the infrastructure for private repositories. We do not expect any downtime during this upgrade.

Travis-CI
Delays processing build logs

Jun 22, 22:16 UTC Resolved - Jobs and logs are processing normally on travis-ci.com.
Jun 22, 22:00 UTC Monitoring - Some message queue instability on the ".com" infrastructure has resulted in delays for build log updates and a backlog of job execution. We're now processing jobs and logs as expected. We will continue monitoring queues.
Jun 22, 21:04 UTC Investigating - We are currently investigating delays processing build logs.

Travis-CI
Delays processing build logs

Jun 22, 22:00 UTC Monitoring - Some message queue instability on the ".com" infrastructure has resulted in delays for build log updates and a backlog of job execution. We're now processing jobs and logs as expected. We will continue monitoring queues.
Jun 22, 21:04 UTC Investigating - We are currently investigating delays processing build logs.

Travis-CI
Delays processing build logs

Jun 22, 21:04 UTC Investigating - We are currently investigating delays processing build logs.

Travis-CI
Build delays and API issues affecting travis-ci.com

Jun 22, 13:53 UTC Resolved - Log processing has recovered. We are investigating the possibility of some stuck log parts. If you do experience paused or stuck logs, please restart those builds. Thank you for your patience! 💛
Jun 22, 13:24 UTC Update - The backlog for private Mac builds has cleared. We continue to process our backlog of log parts.
Jun 22, 12:36 UTC Update - Job backlogs for private Linux builds have cleared. We continue to process the Mac builds, as well as the log parts.
Jun 22, 12:16 UTC Monitoring - Job processing is recovering. We are bringing up extra capacity to process the job backlogs more quickly. We are also processing our backlog of log parts. We're continuing to closely monitor the situation.
Jun 22, 11:51 UTC Identified - We have identified a correlated issue with one of our RabbitMQ instances. The faulty instance has been restarted, and we are waiting for capacity to fully recover. This has created job delays and backlogs for private builds on all infrastructures.
Jun 22, 11:27 UTC Update - We’ve manually bumped our Linux capacity to help private Linux builds process faster while we continue working on log processing.
Jun 22, 11:03 UTC Monitoring - We’ve identified and fixed a memory issue caused by a high spike in our sync queues. Some build requests were lost during this process; please push your commit again to ensure your build is triggered. We’re monitoring, and builds should be running as expected from now on. We’re also working on processing logs and will give an update as soon as they are at 100%.
Jun 22, 10:43 UTC Investigating - We’re investigating API and sync connection issues, which are causing build delays for private repositories, login issues, and problems displaying and retrieving logs.
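For context: one common way to re-trigger a build without changing any code, as suggested above, is to push an empty commit (a minimal sketch; the branch name is hypothetical).

    git commit --allow-empty -m "Trigger Travis CI rebuild"
    git push origin master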

Travis-CI
Build delays and API issues affecting travis-ci.com

Jun 22, 13:24 UTC Update - The backlog for private Mac builds has cleared. We continue to process our backlog of log parts.
Jun 22, 12:36 UTC Update - Job backlogs for private Linux builds have cleared. We continue to process the Mac builds, as well as the log parts.
Jun 22, 12:16 UTC Monitoring - Job processing is recovering. We are bringing up extra capacity to process the job backlogs more quickly. We are also processing our backlog of log parts. We're continuing to closely monitor the situation.
Jun 22, 11:51 UTC Identified - We have identified a correlated issue with one of our RabbitMQ instances. The faulty instance has been restarted, and we are waiting for capacity to fully recover. This has created job delays and backlogs for private builds on all infrastructures.
Jun 22, 11:27 UTC Update - We’ve manually bumped our Linux capacity to help private Linux builds process faster while we continue working on log processing.
Jun 22, 11:03 UTC Monitoring - We’ve identified and fixed a memory issue caused by a high spike in our sync queues. Some build requests were lost during this process; please push your commit again to ensure your build is triggered. We’re monitoring, and builds should be running as expected from now on. We’re also working on processing logs and will give an update as soon as they are at 100%.
Jun 22, 10:43 UTC Investigating - We’re investigating API and sync connection issues, which are causing build delays for private repositories, login issues, and problems displaying and retrieving logs.

Travis-CI
Build delays and API issues affecting travis-ci.com

Jun 22, 12:36 UTC Update - Job backlogs for private linux builds have cleared. We continue to process the Mac builds, as well as the log parts.Jun 22, 12:16 UTC Monitoring - Job processing is recovering. We are bringing up extra capacity to process the job backlogs more quickly. We are also processing our backlog of log parts. We're continue to closely monitor the situation.Jun 22, 11:51 UTC Identified - We have identified a correlating issue with one of our RabbitMQ instances. The faulty instance has been restarted, and we are waiting for capacity to fully recover. This has created job delays and backlogs for private builds on all infrastructures.Jun 22, 11:27 UTC Update - We’ve manually bumped our Linux capacity to help private Linux builds process faster while we continue working on logs processing.Jun 22, 11:03 UTC Monitoring - We’ve identified and fixed a memory issue caused by a high spike in our sync queues. Some build request have been lost during this process, please re-push a commit again to ensure your build is triggered. We’re monitoring and builds should be running as expected from now on. We’re also working on processing logs and will give an update as soon as they are at 100%.Jun 22, 10:43 UTC Investigating - We’re investigating API and sync connection issues, causing build delays for private repositories, log in issues and logs display and retrieval issues.

Travis-CI
Build delays and API issues affecting travis-ci.com

Jun 22, 12:16 UTC Monitoring - Job processing is recovering. We are bringing up extra capacity to process the job backlogs more quickly. We are also processing our backlog of log parts. We're continue to closely monitor the situation.Jun 22, 11:51 UTC Identified - We have identified a correlating issue with one of our RabbitMQ instances. The faulty instance has been restarted, and we are waiting for capacity to fully recover. This has created job delays and backlogs for private builds on all infrastructures.Jun 22, 11:27 UTC Update - We’ve manually bumped our Linux capacity to help private Linux builds process faster while we continue working on logs processing.Jun 22, 11:03 UTC Monitoring - We’ve identified and fixed a memory issue caused by a high spike in our sync queues. Some build request have been lost during this process, please re-push a commit again to ensure your build is triggered. We’re monitoring and builds should be running as expected from now on. We’re also working on processing logs and will give an update as soon as they are at 100%.Jun 22, 10:43 UTC Investigating - We’re investigating API and sync connection issues, causing build delays for private repositories, log in issues and logs display and retrieval issues.

Travis-CI
Build delays and API issues affecting travis-ci.com

Jun 22, 11:51 UTC Identified - We have identified a correlating issue with one of our RabbitMQ instances. The faulty instance has been restarted, and we are waiting for capacity to fully recover. This has created job delays and backlogs for private builds on all infrastructures.Jun 22, 11:27 UTC Update - We’ve manually bumped our Linux capacity to help private Linux builds process faster while we continue working on logs processing.Jun 22, 11:03 UTC Monitoring - We’ve identified and fixed a memory issue caused by a high spike in our sync queues. Some build request have been lost during this process, please re-push a commit again to ensure your build is triggered. We’re monitoring and builds should be running as expected from now on. We’re also working on processing logs and will give an update as soon as they are at 100%.Jun 22, 10:43 UTC Investigating - We’re investigating API and sync connection issues, causing build delays for private repositories, log in issues and logs display and retrieval issues.

Travis-CI
Build delays and API issues affecting travis-ci.com

Jun 22, 11:27 UTC Update - We’ve manually bumped our Linux capacity to help private Linux builds process faster while we continue working on logs processing.Jun 22, 11:03 UTC Monitoring - We’ve identified and fixed a memory issue caused by a high spike in our sync queues. Some build request have been lost during this process, please re-push a commit again to ensure your build is triggered. We’re monitoring and builds should be running as expected from now on. We’re also working on processing logs and will give an update as soon as they are at 100%.Jun 22, 10:43 UTC Investigating - We’re investigating API and sync connection issues, causing build delays for private repositories, log in issues and logs display and retrieval issues.

Travis-CI
Build delays and API issues affecting travis-ci.com

Jun 22, 11:03 UTC Monitoring - We’ve identified and fixed a memory issue caused by a high spike in our sync queues. Some build request have been lost during this process, please re-push a commit again to ensure your build is triggered. We’re monitoring and builds should be running as expected from now on. We’re also working on processing logs and will give an update as soon as they are at 100%.Jun 22, 10:43 UTC Investigating - We’re investigating API and sync connection issues, causing build delays for private repositories, log in issues and logs display and retrieval issues.

Travis-CI
Build delays and API issues affecting travis-ci.com

Jun 22, 10:43 UTC Investigating - We’re investigating API and sync connection issues, causing build delays for private repositories, log in issues and logs display and retrieval issues.

Travis-CI
Backlog for macOS and sudo-enabled Linux

Jun 18, 14:02 UTC Resolved - The current macOS backlog for public repositories is at a normal level and should be cleared within the next hour. Thank you for your patience!
Jun 18, 13:30 UTC Monitoring - The last remaining backlog is for macOS public repositories.
Jun 18, 12:08 UTC Identified - We have rolled back an update that was bundled with last night's maintenance and are already seeing a decline in backlogs.
Jun 18, 11:37 UTC Investigating - We are investigating backlogs on macOS and sudo-enabled Linux for both public and private repositories.

Travis-CI
Database migrations for two backend services

Jun 18, 03:57 UTC Completed - The scheduled maintenance has been completed.
Jun 18, 03:49 UTC Verifying - We are verifying the maintenance and are currently investigating heightened AMQP errors on sudo-enabled Linux.
Jun 18, 02:00 UTC In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary.
Jun 14, 20:34 UTC Scheduled - We need to migrate two backend services to newly provisioned databases so that we can decommission an older, larger, shared database instance. Public and private repositories running on macOS and sudo-enabled Linux will be affected, with no new jobs scheduled while maintenance is underway.

Travis-CI
Build delays for private repositories

Jun 14, 22:09 UTC Resolved - The backlog for `sudo: required` GCE and macOS private builds has cleared.
Jun 14, 21:43 UTC Update - Due to resource contention on one of our backend services, the backlog for private Mac and sudo-enabled builds is taking longer than normal to process. Builds are processing at full capacity, and the backlog continues to decrease.
Jun 14, 20:03 UTC Update - Private travis-ci.com builds for Mac and sudo-enabled Trusty/Precise are running at full capacity; thank you for your patience as we work through the remaining backlog.
Jun 14, 19:21 UTC Update - The backlog on our container-based Precise infrastructure (i.e. `sudo: false` + `dist: precise`) is now cleared.
Jun 14, 19:17 UTC Update - The backlog on our container-based Trusty infrastructure (i.e. `sudo: false` + `dist: trusty`) is now cleared.
Jun 14, 19:15 UTC Monitoring - We are seeing a downward trend in the backlogs of all our infrastructures. We will update when they have cleared. Thank you.
Jun 14, 18:48 UTC Identified - We have restarted a component that was failing to process job requests. Since the restart, jobs appear to be processing normally. We are monitoring the situation closely.
Jun 14, 18:25 UTC Investigating - We are currently seeing delays for builds on private repositories. We are escalating the issue with one of our third-party providers and will post an update as soon as we know more. Thank you for your patience!
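
For readers decoding the `sudo: false + dist: precise` shorthand above: at the time, the `sudo`, `dist`, and `os` keys in a repository's .travis.yml selected which of these infrastructures ran a job. A minimal, illustrative sketch (the `language` value is an arbitrary example):

```
# Illustrative .travis.yml keys and the infrastructure each selected:
#   sudo: false     -> container-based infrastructure
#   sudo: required  -> sudo-enabled VMs (GCE)
#   dist: precise / trusty -> Ubuntu 12.04 / 14.04 image
#   os: osx         -> Mac infrastructure
language: ruby
sudo: false   # container-based
dist: trusty  # Ubuntu 14.04 (Trusty)
```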

Travis-CI
Database upgrade on travis-ci.com

Jun 2, 06:40 UTC Completed - The maintenance is complete, thanks for bearing with us! 💛
Jun 2, 06:00 UTC In progress - We are beginning our scheduled maintenance on travis-ci.com.
Jun 1, 16:31 UTC Scheduled - We are performing scheduled maintenance on travis-ci.com on Friday, June 2, 2017, from 07:00 to 08:00 UTC. We expect the travis-ci.com API and web interface to be unavailable for some of that time window. Processing of private builds is also expected to be delayed. Open-source builds (travis-ci.org) are unaffected by this maintenance.

Travis-CI
travis-ci.com partially unavailable

Jun 1, 03:07 UTC Resolved - Full operation has been restored. Part of the resolution required purging the automatic daily GitHub sync queue backlog. Manual GitHub sync remains available, and automatic daily GitHub sync will trigger again within the next 18 hours. Thank you for your patience!
Jun 1, 01:23 UTC Update - Most of the GitHub sync queues have caught up, with the exception of automatic daily sync. Overall database load remains higher than usual while we work through the backlog. We are planning to address this with some changes to database indexes within the next day.
May 31, 18:36 UTC Update - We have resumed GitHub syncing at reduced scale.
May 31, 16:15 UTC Update - GitHub syncing has been temporarily disabled while we stabilize things.
May 31, 15:42 UTC Monitoring - Our API service has recovered and is operating normally. We are continuing to monitor the issue.
May 31, 15:27 UTC Investigating - Travis CI for private projects (https://travis-ci.com) is currently partially unavailable, as our API is seeing an elevated number of errors.

Travis-CI
Build delays - GitHub API Latency

May 31, 16:13 UTC Resolved - Open-source backlogs have been processed, and builds are processing normally.
May 31, 15:13 UTC Update - Upstream has recovered, and we have completed processing our backlogs for incoming builds and GitHub status updates. Private builds should no longer see any delays. We are working through the job backlog for open-source builds, which are still experiencing some delays. Thanks for your patience! 💛
May 31, 14:28 UTC Update - We are still seeing elevated error levels, and we are scaling out capacity to work through the backlog more quickly.
May 31, 13:27 UTC Monitoring - Please bear with us as we scale out for demand due to GitHub API latency. Short wait times for builds and delayed notifications are expected.

Travis-CI
Delays for `sudo: required` builds on both .com and .org

May 31, 13:28 UTC Resolved - Network error rates have returned to normal, low levels, although we have yet to identify the contributing factors with the help of Google support. Thank you again for your patience.
May 31, 12:12 UTC Update - We are still working with Google Compute Engine support to get to the source of the SSH timeouts; users may continue to experience longer-than-normal wait times for `sudo: required` builds.
May 30, 23:13 UTC Update - We are continuing to work with Google support to identify the factors contributing to SSH timeouts. We will publish another update when new information is available. Thank you again for your patience.
May 30, 21:26 UTC Update - We are continuing to work with Google support to identify the factors contributing to SSH timeouts. Thank you for your patience!
May 30, 18:19 UTC Investigating - We are currently seeing an elevated number of `sudo: required` builds getting re-queued on GCE, which is causing delays for both private and public builds. We are escalating the issue with Google support. We will post an update when we know more. Thank you for your patience!
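
One workaround, not suggested in the report itself but consistent with how jobs were routed at the time: builds that don't actually need root access could sidestep the affected GCE infrastructure by targeting the container-based environment instead. A sketch, assuming the build works without sudo:

```
# Hypothetical mitigation (an assumption, not advice from the incident):
# route the job to the container-based infrastructure instead of GCE.
sudo: false   # container-based; avoids the sudo-enabled GCE pool
dist: trusty
```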

Travis-CI
Logs delayed for .com builds

May 22, 23:55 UTC Resolved - Logs are now processing normally. Thanks for your patience!
May 22, 23:48 UTC Investigating - We are investigating a delay in processing logs for paid, closed-source builds.

Travis-CI
macOS Infrastructure Network Improvements

May 21, 20:51 UTC Completed - The network maintenance is complete. Thank you for your patience, and happy building!
May 21, 20:37 UTC Verifying - The infrastructure provider has installed the HA pair, and we are checking to make sure that everything is still working as intended.
May 21, 20:12 UTC In progress - macOS builds have been halted in preparation for network maintenance. We are now waiting on our infrastructure provider to install the HA router pair.
May 21, 20:03 UTC Update - macOS builds will be halted shortly in preparation for the network maintenance on the macOS build infrastructure.
May 16, 01:32 UTC Scheduled - Job processing on the macOS infrastructure will be stopped for a time so that we can install a high-availability router pair in place of our single router. This will allow us to provide more stability going forward.

Travis-CI
Private builds aren't starting properly

May 18, 20:55 UTC Resolved - Jobs and logs are processing normally.
May 18, 20:17 UTC Monitoring - We are happy to report that there's no backlog on our sudo-enabled infrastructure (i.e. GCE) at this time. There are still backlogs on our container-based and Mac infrastructures.
May 18, 20:04 UTC Update - We have resumed processing builds and are seeing the backlog go down. We are monitoring to make sure everything returns to normal. Thank you!
May 18, 19:45 UTC Identified - We've identified that a network issue of some kind interrupted connections to our RabbitMQ cluster, and we're in the process of restarting backend services that were left in an error state by the interruption. We'll provide another update as we confirm we're processing builds properly again.
May 18, 19:11 UTC Investigating - We are seeing reports of builds not starting for private repositories. We are currently looking into it. Thank you for your patience!

Travis-CI
Delays for private builds on travis-ci.com

May 8, 22:24 UTC Resolved - This incident has been resolved.
May 8, 21:17 UTC Update - Backlogs are clear on all Linux queues. We are still working through a backlog for Mac jobs.
May 8, 20:40 UTC Update - Backlogs are clear on all container-based Linux queues. We have brought additional capacity online for sudo-enabled Linux. Backlogs remain on Mac and on sudo-enabled Precise and Trusty.
May 8, 19:59 UTC Monitoring - All infrastructures are now processing builds. The container-based infrastructure running Precise doesn't have a backlog right now. A backlog remains, however, on Mac, container-based Trusty, and sudo-enabled Precise and Trusty. We will give another update on the state of these backlogs soon.
May 8, 19:36 UTC Update - Our RabbitMQ cluster has been restored. We are now restarting our workers to start processing builds again.
May 8, 19:14 UTC Update - We are currently working with our RabbitMQ provider to help us get our cluster back up. We are sorry for the continued trouble.
May 8, 18:33 UTC Update - We are still having connectivity issues between our components and need to perform emergency maintenance on our RabbitMQ cluster to fix them. Thank you again for your patience.
May 8, 17:44 UTC Update - We are still working on bringing our build infrastructures back, and build logs remain unavailable. We have restarted one of our components and are currently assessing the resulting situation. We are sorry for the continued trouble.
May 8, 16:34 UTC Update - Full build processing capacity has been restored on our Mac infrastructure. Other infrastructures should be back on their feet shortly. Thanks for hanging in there.
May 8, 16:19 UTC Investigating - We are currently seeing a backlog of private builds on travis-ci.com, so you might see delays before your builds start. Our engineering team is looking into it. We'll update this incident when we know more. Thank you for your enduring patience.

Travis-CI
Build logs are currently missing for private builds on travis-ci.com

May 8, 16:17 UTC Resolved - Logs should now be displaying properly. Please reach out at support [at] travis-ci [dot] com if that's not the case. Thank you! 💛
May 8, 16:05 UTC Monitoring - We've been able to get the display of build logs back on its feet. We are monitoring to see that everything is working correctly now.
May 8, 15:37 UTC Investigating - We are seeing reports of build logs being unavailable for private builds on travis-ci.com. Our engineering team is looking into it. We will update this incident as soon as we know more. Thank you for your patience!

Travis-CI
Syncing user data with GitHub is delayed on travis-ci.org

May 7, 11:37 UTC Resolved - This incident has been resolved.
May 7, 11:14 UTC Monitoring - A fix has been implemented and we are monitoring the results.
May 7, 08:39 UTC Investigating - We are currently investigating this issue.

Travis-CI
Gem installation from rubygems.org fails on our sudo-less Precise container for both private and public builds

May 3, 00:33 UTC Resolved - The issue has been resolved.
May 2, 21:53 UTC Monitoring - Gem installation should now be working. Please restart the affected builds. Thank you!
May 2, 21:02 UTC Identified - We have identified the issue: a TLS handshake with https://rubygems.org cannot be established with anything less than TLS v1.2. We are working with rubygems.org support and their upstream service provider to resolve this issue.
May 2, 21:01 UTC Update - A side effect of this issue is that we are unable to install `dpl`, which is required for deployment.
May 2, 20:31 UTC Investigating - We've received reports of gem installation failures on our sudo-less Precise containers, i.e. for builds with:

```
sudo: false
dist: precise # this may or may not exist
```

We are publicly tracking this issue at https://github.com/travis-ci/travis-ci/issues/7685. In the meantime, we suggest routing your builds to our sudo-enabled Precise infrastructure with the following in your .travis.yml file:

```
sudo: required
dist: precise
```

Thank you for your patience!
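
Not part of the incident report, but a small diagnostic consistent with the root cause above: a hypothetical `before_install` step that prints the TLS protocol version the build image negotiates with rubygems.org. Anything below TLSv1.2 would reproduce the handshake failure described in the Identified update.

```
# Hypothetical diagnostic (an illustration, not from the incident):
# print the TLS version negotiated with rubygems.org inside the build.
before_install:
  - ruby -ropenssl -rsocket -e 'ssl = OpenSSL::SSL::SSLSocket.new(TCPSocket.new("rubygems.org", 443)); ssl.hostname = "rubygems.org"; ssl.connect; puts ssl.ssl_version'
```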

Travis-CI
Logs display issues for public repositories

Apr 27, 08:08 UTC Resolved - We identified and fixed an issue displaying logs for public repositories after the maintenance. Logs should be displaying normally now. Thanks for your patience.

Travis-CI
Logs database partition maintenance for public repos

Apr 27, 06:18 UTC Completed - The scheduled maintenance has been completed.
Apr 27, 05:28 UTC Verifying - Verification is currently underway for the maintenance items.
Apr 27, 05:03 UTC Update - We have extended the maintenance for another hour while waiting for an index to rebuild.
Apr 27, 03:38 UTC Update - The most recently executed partitioning query is taking longer than expected, so we are extending the maintenance by 1 hour.
Apr 27, 01:00 UTC In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary.
Apr 26, 15:36 UTC Scheduled - We need to get our regular partition maintenance query back on track, which will require an interruption in log parts processing. During the maintenance, we expect jobs to continue running, including any deployment steps and GitHub status updates, but realtime log streaming and lookup of log output from *public* jobs that were running as of the beginning of the maintenance will be unavailable.

Travis-CI
Delays in processing builds for container-based private repositories

Apr 25, 12:54 UTC Resolved - The backlog for container-based infrastructure for private builds has cleared. Thank you for your patience.
Apr 25, 11:23 UTC Identified - A surge in demand for our docker-based infrastructure for private repositories has caused a small backlog. We will scale up manually to process the backlog more quickly. We apologize for build wait times.

Travis-CI
Database connectivity issues

Apr 25, 05:22 UTC Resolved - This incident has been resolved.
Apr 25, 04:45 UTC Monitoring - A fix has been implemented and we are monitoring the results.
Apr 25, 03:41 UTC Identified - Primary databases for both public and private repositories appear to have failed over to their respective standby servers. All but one application recovered automatically, and we finished reconfiguring the remaining application a few minutes ago. We are in the process of verifying behavior and checking for potential data loss.
Apr 25, 03:29 UTC Investigating - We are investigating alerts that indicate issues communicating with the primary database for private repos.

Travis-CI
Logs database partition maintenance

Apr 23, 18:57 UTC Completed - The scheduled maintenance has been completed.
Apr 23, 18:44 UTC Verifying - Verification is currently underway for the maintenance items.
Apr 23, 18:00 UTC In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary.
Apr 17, 21:34 UTC Scheduled - We are introducing the use of partitions to our logs database for travis-ci.com in order to ensure we can continue to scale the existing design. This will require degraded service while introducing a gap in logs processing, followed by a database upgrade and migration. Upon completion, we expect job processing to continue as usual and quickly catch up with the backlog that accumulates during the maintenance.

Travis-CI
Logs database partition maintenance

Apr 16, 19:58 UTC Completed - The scheduled maintenance has been completed.
Apr 16, 19:30 UTC Verifying - Verification is currently underway for the maintenance items.
Apr 16, 18:00 UTC In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary.
Apr 14, 17:57 UTC Scheduled - We are introducing the use of partitions to our logs database for travis-ci.org in order to ensure we can continue to scale the existing design. This will require degraded service while introducing a gap in logs processing, followed by a database upgrade and migration. Upon completion, we expect job processing to continue as usual and quickly catch up with the backlog that accumulates during the maintenance.

Travis-CI
Delays on container-based paid builds

Apr 13, 16:14 UTC Resolved - Capacity is now back to normal on our container-based infrastructure for travis-ci.com. Thank you for your patience!
Apr 13, 15:35 UTC Investigating - We are investigating delays starting container-based builds on travis-ci.com.

Travis-CI
travis-ci.org logs database outage

Apr 10, 16:37 UTC Resolved - This incident has been resolved.
Apr 10, 15:34 UTC Monitoring - A fix has been implemented and we are monitoring the results.
Apr 10, 15:28 UTC Update - We have completed the data restoration process. We will be re-enabling mutability of the affected records shortly.
Apr 7, 14:52 UTC Update - We are continuing work on restoring historical data, which we expect will take between 1 and 3 days. We will be updating this incident once the restoration process is complete.
Mar 31, 16:12 UTC Identified - We apologize for the delay in updating this incident. Due to database infrastructure issues, we have been unable to successfully extract all data from our previous primary database. Any jobs that completed prior to the incident cannot be effectively restarted for updated logs. We continue working towards restoring the relevant records. This is estimated to be a ~5-day process due to the size of the database tables. We are extremely sorry and just as frustrated with this as you probably are. We will provide another update once we have more information. Please email support@travis-ci.com if you have any questions in the meantime.
Mar 30, 01:42 UTC Update - We apologize for the delay in updating this incident. Our earlier database maintenance included putting a new, empty logs database into production. The previous database still contains the only copy of parts of some build logs, written in the few minutes before the database was no longer able to accept writes. We've been working towards extracting that data this afternoon so that we can get it archived into S3 and make it accessible again. However, due to the effects of the original problem, we need the assistance of our database infrastructure provider in performing some actions on the previous primary database. We've escalated this with them, but do not currently have an ETA on when this work will be completed. We will provide another update once we hear back from them. Please email support@travis-ci.com if you have any questions in the meantime.
Mar 29, 20:01 UTC Monitoring - If you restart jobs that completed before 18:00 UTC, you will continue to see the previous job's output or inconsistent logs displayed. We are working on a resolution for this, but it may be a few hours before this is done.
Mar 29, 19:50 UTC Update - We are resuming build processing. New build logs should appear, but recent build logs may be unavailable until further notice. We also expect some delay in build processing while we catch up with the backlog.
Mar 29, 19:23 UTC Update - We are continuing our database maintenance.
Mar 29, 18:52 UTC Update - We are performing emergency database maintenance. We hope to resume processing build requests shortly.
Mar 29, 18:22 UTC Update - We have stopped accepting builds while we perform emergency maintenance.
Mar 29, 18:06 UTC Identified - The logs database is having problems writing logs. We are working with our service provider to resolve this issue.
Mar 29, 18:00 UTC Investigating - We are investigating a .org logs database problem. Logs are unavailable at this time.

Travis-CI
Slow network connection between sudo-enabled infrastructure and Azure

Apr 8, 14:07 UTC Resolved - Network performance is back to normal.
Apr 8, 00:55 UTC Monitoring - We are seeing promising reports of improved network performance. If your builds were affected previously, please restart them to upload again.
Apr 7, 14:34 UTC Investigating - We have received reports of slow network connections between jobs running on our sudo-enabled infrastructure and Microsoft Azure servers. This typically manifests when a job attempts to upload Docker images and times out after 10 minutes. The problem started around 2017-04-06 01:00 UTC (about 40 hours ago). We are working with our infrastructure service provider to resolve this issue.
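
Because the 10-minute limit here is the no-output timeout, one stopgap while upload throughput is degraded is to wrap the slow push in the `travis_wait` helper, which lets a silent command run longer before the job is killed. A minimal sketch under that assumption; the image name is a placeholder:

```
# Minimal sketch: give a slow `docker push` up to 30 minutes of
# silence instead of the default 10 before the job is terminated.
sudo: required
services:
  - docker
script:
  - docker build -t example/app .            # placeholder image name
  - travis_wait 30 docker push example/app   # extend the no-output timeout
```

Note that `travis_wait` buffers the wrapped command's output, so the push's progress typically appears in the log only after it finishes.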

Travis-CI
Build delays on macOS infrastructure

Apr 5, 12:23 UTC Resolved - macOS builds are once again being processed normally.
Apr 5, 11:13 UTC Monitoring - We've corrected the network settings and are currently monitoring the situation.
Apr 5, 10:45 UTC Identified - We've identified the underlying cause as a network misconfiguration on one of our base build VMs.
Apr 5, 10:16 UTC Investigating - We are investigating issues with our macOS infrastructure, which are resulting in build delays for a limited number of jobs.

Travis-CI
Private macOS builds backlog

Apr 4, 18:42 UTC Resolved - The macOS private builds backlog has cleared.
Apr 4, 16:42 UTC Identified - Private macOS builds are experiencing a backlog at the moment. We've increased capacity to get through it more quickly; expect wait times for another 30 minutes.

Travis-CI
Build delays for public repositories

Apr 3, 20:46 UTC Resolved - Due to high demand on our public repos, we still have a small backlog for sudo-enabled and macOS builds. Expect short wait times to persist through the afternoon PST (UTC-7). Jobs continue to be processed at full capacity.
Apr 3, 18:14 UTC Monitoring - We've identified the issue: an enqueue error in our scheduler due to a locked database query. The block was removed, and jobs are being scheduled at full capacity. We will continue to monitor while the backlog of jobs for public repositories clears.
Apr 3, 17:12 UTC Investigating - We're investigating build delays affecting all public repositories running at travis-ci.org.

Travis-CI
Mac infrastructure network maintenance

Apr 2, 22:12 UTC Completed - We ran into a bug with one of the new networking components, and have decided to cancel the second stage of the maintenance and reschedule it for a later time. We've finished the cleanup and are closing tonight's maintenance window now.
Apr 2, 22:03 UTC Verifying - We're working on some finishing cleanup and will be closing the maintenance window shortly.
Apr 2, 20:58 UTC Update - We're still working on the second stage of the maintenance, so we are extending the maintenance window.
Apr 2, 20:00 UTC Update - We're starting the second stage of the maintenance now.
Apr 2, 19:05 UTC Update - We've finished the first stage of the maintenance. We will start the second stage at 20:00 UTC, in just under an hour.
Apr 2, 17:06 UTC In progress - We're beginning the Mac infrastructure network maintenance.
Mar 24, 09:30 UTC Scheduled - On Sunday, April 2nd, 2017, from 17:00 to 21:00 UTC, we will be performing network maintenance on our Mac infrastructure to improve and test the redundancy in our network stack. The maintenance will be performed in two stages, the first starting at 17:00 UTC and the second starting at 20:00 UTC. We're not expecting any user-visible impact apart from brief packet loss as we perform failover tests.

Travis-CI
Delays in macOS jobs starting for both public and private repositories

Mar 31, 17:02 UTC Resolved - We've been able to stabilize things, and all delays have been resolved at this point. If you continue to see any issues, please email support@travis-ci.com.
Mar 31, 14:33 UTC Identified - We are working to recover the physical hosts and isolate them. We've made some improvements and you should see reduced wait times, but delays will continue for now. We'll provide an update when we know more.
Mar 31, 14:18 UTC Investigating - We are investigating issues with multiple physical hosts, which are resulting in delays in macOS builds for both public and private repositories.

Travis-CI
Heightened rate of API errors

Mar 29, 14:09 UTC Resolved - Systems are operating normally. Thanks for bearing with us!
Mar 29, 13:50 UTC Monitoring - Maintenance has been completed and services appear to be stable. We will continue to monitor.
Mar 29, 13:26 UTC Identified - We mitigated the issue and API error rates have recovered. We are continuing with maintenance to make sure the API is stable.
Mar 29, 13:04 UTC Investigating - We are investigating reports of a higher rate of API errors for .org.

Travis-CI
Logs display delayed

Mar 28, 11:45 UTC Resolved - This issue has been resolved and logs are working as expected now. Thanks for your patience!
Mar 28, 11:03 UTC Monitoring - We've deployed a fix for this issue and logs should now be rendering properly. We're currently monitoring. Please email support@travis-ci.com if you notice any log hiccups.
Mar 28, 10:18 UTC Identified - We've identified an issue that is causing some logs for both public and private repositories to not render properly. We're working on a fix for this.
Mar 28, 07:52 UTC Investigating - We are investigating reports that logs are not always displayed in the web UI.

Travis-CI
API maintenance for travis-ci.org and travis-ci.com

Mar 27, 21:35 UTC Completed - The scheduled maintenance has been completed.
Mar 27, 20:52 UTC Verifying - We've completed the update of the API components on both travis-ci.org and travis-ci.com. We are continuing to monitor closely.
Mar 27, 19:43 UTC Update - We've finished the update of the API components on travis-ci.org and we are monitoring things closely. We shall begin updating travis-ci.com shortly.
Mar 27, 18:30 UTC In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary.
Mar 27, 18:08 UTC Scheduled - We will be performing maintenance on our API for more efficient access to our logs database. We will first update the API for travis-ci.org and then travis-ci.com, with at least an hour in between for monitoring. We don't expect any visible downtime or user impact during the maintenance. Please contact support [at] travis-ci [dot] com if you see something.

Travis-CI
Logs display issues

Mar 27, 08:26 UTC Resolved - The backlog for both open-source and private builds has now cleared. Thank you for your patience.
Mar 27, 08:22 UTC Update - The backlog for our open-source builds has cleared. The private-builds backlog continues to drop.
Mar 27, 08:02 UTC Monitoring - We have identified the issue, applied a fix, and are monitoring as things recover. Display of newer logs may be delayed until we process the backlog.
Mar 27, 08:02 UTC Identified - We have identified the issue, applied a fix, and are monitoring as things recover. Display of newer logs may be delayed until we process the backlog.
Mar 27, 07:40 UTC Investigating - We're investigating issues with logs not being displayed.

Travis-CI
Delayed build requests due to webhook delivery delays on GitHub

Mar 24, 12:36 UTC Resolved - Webhook delivery delays have been resolved on GitHub. We have confirmed we're receiving build request events immediately.
Mar 24, 12:27 UTC Identified - We are experiencing delayed build requests due to webhook delivery delays on GitHub (up to ~3min). We are monitoring the situation. Also see https://status.github.com.

Travis-CI
Logs display issues for public projects using travis-ci.org

Mar 23, 15:32 UTC Resolved - We have confirmed the effectiveness of the short-term fix. We will continue investigating for a longer-term resolution. Thank you for your patience.
Mar 23, 14:55 UTC Monitoring - We have identified the problematic component. While we continue investigating a long-term fix, we have applied a short-term one to have the logs load correctly.
Mar 23, 14:13 UTC Investigating - We're investigating issues displaying public logs while builds are running.

Travis-CI
Delays for sudo-enabled Linux builds

Mar 22, 16:59 UTC Resolved - The issue was confirmed fixed by our infrastructure provider, and we're no longer seeing errors while booting instances. Jobs are running normally across all our infrastructures at this time.
Mar 22, 16:29 UTC Identified - Our infrastructure provider has identified an issue and is working on a fix.
Mar 22, 15:46 UTC Investigating - We're investigating elevated error rates while booting instances for sudo-enabled Linux builds, which is causing jobs to take longer to start.

Travis-CI
travis-ci.org website potentially unreachable or slow

Mar 20, 23:43 UTC Resolved - Things have been reliable and stable, so we are resolving this issue.
Mar 20, 16:59 UTC Monitoring - The unresponsive host has been removed from the set of DNS records. The site should be fully reachable again. We continue to monitor the situation.
Mar 20, 16:29 UTC Identified - We are receiving reports of travis-ci.org being unreachable for some users. One of the 8 edge server IPs appears to be unreachable via TCP. We have escalated the issue to our upstream provider.

Travis-CI
Intermittent slowness from both nodejs.org and registry.yarnpkg.com on container-based builds for public and private repositories

Mar 17, 02:45 UTC Resolved - User reports of issues have cleared up as of approximately noon PST. We apologize for the delays in resolving this issue. Please email support@travis-ci.com if you see any further issues.
Mar 16, 15:30 UTC Update - We are seeing some reports of similar intermittent slowness from both nodejs.org and registry.yarnpkg.com from other users outside of Travis CI's infrastructure, which seems to indicate a potential upstream issue. Restarting affected builds can help, and this is recommended for the moment. We are continuing to investigate.
Mar 16, 14:20 UTC Investigating - We're currently investigating slowness and connectivity issues on `sudo: false`, container-based builds. This seems to be especially affecting Node.js installations via nvm.
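
Until an upstream slowdown like this clears, wrapping the network-bound steps in the `travis_retry` helper, which reruns a failing command up to three times, can spare a full build restart. A minimal sketch under that assumption; the Node version and install command are placeholders:

```
# Minimal sketch: retry flaky fetches from registry.yarnpkg.com
# instead of failing the job on the first timeout.
language: node_js
node_js: "6"          # placeholder version
sudo: false
install:
  - travis_retry yarn install   # reruns up to three times on failure
```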

Travis-CI
Planned: Reduced public macOS capacity

Mar 16, 02:40 UTC Resolved - At this time things are now running at our current full capacity for public macOS builds. Please email support@travis-ci.com if you have any questions. Thank you for your patience while we worked through some underlying improvements that required reduced capacity today.
Mar 16, 01:29 UTC Update - We are beginning to add capacity back for public macOS jobs and will provide an update when this process is finished.
Mar 15, 18:08 UTC Identified - We are in the process of reducing capacity for public macOS builds, in order to support moving some of our hardware capacity for infrastructure improvements. We'll be at reduced capacity for a few hours and we'll provide periodic updates. Please email support@travis-ci.com if you have any questions.

Travis-CI
Network connectivity issues for public and private sudo-enabled builds

Mar 15, 22:32 UTC Resolved - Google has resolved the issue and we're seeing jobs execute as expected. Please email support@travis-ci.com if you continue to see any network issues. Thanks for your patience during this issue.
Mar 15, 21:21 UTC Update - We're seeing job error rates start to drop, which indicates the issue may be improving, but we're waiting on updates from Google Cloud's status site: https://status.cloud.google.com/incident/compute/17006.
Mar 15, 21:00 UTC Identified - Our cloud provider for sudo-enabled builds, Google Compute Engine (GCE), is currently experiencing network connectivity issues, and this is affecting public and private builds. We're monitoring their status incident, https://status.cloud.google.com/incident/compute/17006, closely and will provide updates as we get them. During this incident you'll see connectivity failures to external resources like Launchpad PPAs, PyPI, RubyGems, etc.

Travis-CI
API errors for travis-ci.com

Mar 15, 10:56 UTC Resolved - The API is responding normally again.
Mar 15, 10:31 UTC Investigating - We're investigating elevated error rates for the travis-ci.com API, which may also affect the web UI.

Travis-CI
Seeing high amounts of errors when trying to start public macOS jobs

Mar 15, 02:45 UTC Resolved - At this time we've restarted the errored jobs and they are in our backlog again. We are processing jobs at full capacity, with the usual backlog due to high demand for public macOS builds. We'll be publishing a postmortem of this incident by the end of the week. We thank you for your patience and understanding while we resolved this issue.
Mar 15, 01:52 UTC Update - A mistake in the order in which we brought up some of the services resulted in all pending macOS jobs "erroring out" quickly. We are currently working on resetting the state of those jobs so that they'll be queued and run again. We are very sorry for this issue and will post an update when the jobs have been requeued.
Mar 15, 01:36 UTC Update - We are beginning to ramp up the capacity and are monitoring things closely.
Mar 15, 01:10 UTC Update - We've been able to restore the backplane to service, and we're working on verification and preparing to resume builds and ramp back up to full capacity.
Mar 15, 00:47 UTC Update - The backplane has come up in an unexpected state and we're escalating with our infrastructure provider, as we'll need their help in resolving this issue. In the meantime we continue to run public macOS builds with degraded capacity. travis-ci.com jobs are not affected at this time. Thank you for your patience while we work to resolve this.
Mar 15, 00:23 UTC Identified - The issue has been identified and a fix is being implemented.
Mar 15, 00:22 UTC Update - The control plane restart is still in progress; we discovered that the root filesystem partition had filled up and we weren't alerted to this issue. We've cleaned up the filesystem and are still working to get the backplane services started up again. In the meantime we continue to run public macOS builds with degraded capacity. travis-ci.com jobs are not affected at this time.
Mar 14, 23:43 UTC Update - The "control backplane" for part of our virtualization infrastructure is unstable, so we're initiating a restart of the backplane. In the meantime we continue to run public macOS builds with degraded capacity. travis-ci.com jobs are not affected at this time.
Mar 14, 23:27 UTC Update - We are investigating some intermittent stability errors with some of our physical servers for this infrastructure and are working to restore stability. At this time we're running public macOS builds with degraded capacity.
Mar 14, 22:59 UTC Investigating - We are seeing a high number of errors when trying to start public macOS jobs. This is causing build delays for travis-ci.org macOS builds and we are investigating why.

Travis-CI
Delays creating Pull Request builds

Mar 14, 15:51 UTC Resolved - We've identified and fixed an issue which caused Pull Request builds to be delayed for approximately 20 minutes. Pull Request builds should now be processing normally. Thank you!

Travis-CI
Missing build requests for Pull Requests

Mar 13, 20:47 UTC Resolved - The rollback has successfully resolved the issue. The underlying cause is a dependency version conflict that wasn't caught earlier; we're working on resolving that conflict, but doing so is not required to resolve this incident. Please email support@travis-ci.com if you see any further issues. Thanks for your patience while we resolved this issue.Mar 13, 20:01 UTC Identified - We've identified an issue with a deploy that resulted in some PR builds, such as ones with merge conflicts, not running. We've rolled back this release and confirmed that new builds with merge conflicts will run. You'll need to open/close existing PRs **or** push a new commit if open/close does not work. We're still working on a fix for the issue. Thanks for your patience.Mar 13, 19:35 UTC Investigating - We've received and are currently investigating reports of missing builds for pull request events.

Travis-CI
Log delays on travis-ci.com

Mar 13, 15:24 UTC Resolved - We've processed the backlog and log messages are appearing in real-time again.Mar 13, 15:00 UTC Investigating - We're investigating delays in log messages for builds on travis-ci.com. Builds will run normally and be marked as finished, but logs may take a few minutes to appear after a build finishes.

Travis-CI
Missing build requests for Pull Request events

Mar 10, 18:43 UTC Resolved - The issue has been resolved.Mar 10, 18:27 UTC Update - We have a report of close/reopen not triggering a PR build. In this case, try pushing a new commit (or force-pushing on top).Mar 10, 18:03 UTC Monitoring - We have deployed the fix for the upstream API change. As the previous PR build requests were lost, please close and reopen the affected PRs to trigger builds.Mar 10, 17:52 UTC Identified - We have traced the missing-builds issue to a change in the upstream GitHub API that introduced new PR statuses we had not previously accounted for in our GitHub API client. This caused PR requests to be dropped. We are working on a fix to account for the new statuses.Mar 10, 16:21 UTC Investigating - We're currently investigating reports of missing builds for Pull Requests on https://travis-ci.com and https://travis-ci.org
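
The root cause here, an upstream API introducing status values the client didn't recognize, which then silently dropped events, is a common integration failure mode. As a hedged sketch (not Travis CI's actual client code), one defensive pattern is to log unknown values and fall through to a conservative default instead of dropping the event:

```python
# Sketch of defensive handling for unknown enum values from an upstream API.
# The status names and handler below are hypothetical, not GitHub's or
# Travis CI's actual values.
import logging

KNOWN_STATUSES = {"open", "closed", "merged"}  # statuses we wrote code for

def handle_pull_request_event(event: dict) -> None:
    status = event.get("status")
    if status not in KNOWN_STATUSES:
        # Don't silently drop the event: log it and fall through to a
        # conservative default so new upstream statuses degrade gracefully.
        logging.warning("unknown PR status %r; processing with defaults", status)
    enqueue_build_request(event)  # hypothetical downstream step

def enqueue_build_request(event: dict) -> None:
    print("queued build for PR", event.get("number"))

if __name__ == "__main__":
    handle_pull_request_event({"number": 42, "status": "draft"})
```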

Travis-CI
Network maintenance on Mac infrastructure

Mar 8, 10:18 UTC Completed - The maintenance completed successfully.Mar 8, 09:00 UTC In progress - We've started the maintenance.Mar 7, 18:23 UTC Scheduled - We will be performing some network maintenance on our Mac infrastructure to add capacity to our network stack. We are performing the maintenance in stages and the components involved all have failovers, so we don't expect any visible downtime or user impact during the maintenance.

Travis-CI
Reduced capacity for private repo Mac Builds

Mar 7, 21:45 UTC Resolved - We have finished bringing additional capacity for private macOS builds online and have caught up on the existing backlog. Everything is operating as expected.Mar 7, 19:02 UTC Identified - Mac builds for private repositories are currently backlogged while we bring additional capacity online.

Travis-CI
Database maintenance for private repo logs

Mar 5, 03:26 UTC Completed - The scheduled maintenance has been completed.Mar 5, 03:18 UTC Verifying - Verification is currently underway for the maintenance items.Mar 5, 03:00 UTC In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary.Mar 3, 21:49 UTC Scheduled - We need to promote the replica database used for job logs on private repositories. Our infrastructure provider has advised us to migrate to a newer instance within 30 days in order to ensure stability and avoid issues like the ones we recently encountered (see: https://www.traviscistatus.com/incidents/hx7cnxbch9xf).
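
For context on what promoting a replica involves: on a typical PostgreSQL setup (an assumption here; the incident doesn't name the database engine), promotion turns a read-only standby into the new primary. A minimal sanity check before and after the switch might look like the sketch below, using the standard `pg_is_in_recovery()` function; the connection string is hypothetical.

```python
# Sketch: verify a PostgreSQL standby's state before and after promotion.
# Assumes PostgreSQL and the psycopg2 driver; the DSN below is hypothetical.
import psycopg2

DSN = "host=replica.example.internal dbname=logs user=ops"  # hypothetical

def is_standby(dsn: str) -> bool:
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            # pg_is_in_recovery() is true on a standby, false on a primary.
            cur.execute("SELECT pg_is_in_recovery()")
            return cur.fetchone()[0]

if __name__ == "__main__":
    print("standby" if is_standby(DSN) else "primary")
```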

Travis-CI
Partial API/logs service outage for travis-ci.org

Mar 2, 09:49 UTC Resolved - We've been monitoring our system for a number of hours and things are now stable. Thanks again for your patience over the last few days.Mar 2, 04:28 UTC Monitoring - A fix has been implemented and we are monitoring the results.Mar 2, 02:36 UTC Update - We've resumed all build processing at this point. Builds are starting and running as expected. Log display via the API and web UI is functional as well. We will be monitoring things closely for the next few hours and into tomorrow. Thank you to everyone for your patience, understanding, and the many kind words via Twitter.Mar 2, 02:20 UTC Update - The database work is done. We are in the process of resuming services and beginning to process jobs again. We're still verifying things and will post another update once we're confident jobs are being processed as expected.Mar 2, 01:48 UTC Update - Our database provider has asked to make some changes to the existing primary logs DB that require us to stop processing new jobs temporarily. All builds will be paused, and log display will result in an error from the API or web UI. We'll post an update once we've resumed builds.Mar 2, 01:07 UTC Update - We are currently waiting on a new replica logs database to finish provisioning, and we plan to fail over to it once it is ready, which we expect to happen in roughly 5 hours. Until then, delays in log display and some errors from the API/web UI should be expected. We are sorry for the extended length of this issue and appreciate your patience while we work through this issue with our database infrastructure provider.Mar 1, 21:41 UTC Update - We are still working on a fix with our infrastructure provider.Mar 1, 20:14 UTC Update - We're currently mostly stable, and we're actively working with our infrastructure provider on a more complete fix. Thanks for hanging in there with us!Mar 1, 15:52 UTC Update - We have found a way to mitigate our degraded API performance in the short term. We continue to monitor performance and wait for the emergency failover database to provision. We are still experiencing a delay of logs in our web front end and will report back as soon as we can.Mar 1, 14:48 UTC Update - Our ongoing database connection issues are due to emergency maintenance following the recent AWS outage. We are working with our upstream provider to rectify a kernel bug and are currently waiting for a new database failover to be provisioned. We expect this to take some time, and will continue to post updates as we have them.Mar 1, 11:53 UTC Identified - We have traced the partial outage to an intermittent database connection issue, and we're working to resolve it.Mar 1, 09:16 UTC Investigating - We are experiencing a partial API outage on travis-ci.org, which is affecting performance of our web front end.

Travis-CI
Issues related to the S3 outage in AWS

Mar 1, 03:40 UTC Resolved - Jobs are processing normally. Thank you for your patience.Mar 1, 02:12 UTC Update - We are processing normally, though there is still a job processing backlog. We are monitoring stability closely while the backlog clears.Mar 1, 00:43 UTC Monitoring - We are currently processing a large job backlog as part of the fallout from the S3 incident. All services are functioning normally, but you may still notice delays until the backlogs clear.Feb 28, 23:14 UTC Update - We are seeing some services recover after the S3 outage, and jobs are processing. We are watching recovery closely, and will keep updating if anything goes awry.Feb 28, 21:11 UTC Update - We are still waiting for Amazon S3 to recover.Feb 28, 18:10 UTC Identified - AWS has confirmed that S3 is experiencing issues. We've taken some actions to maintain current container-based capacity and are monitoring the S3 status and overall health of our infrastructure closely.Feb 28, 18:00 UTC Investigating - We are investigating issues related to the S3 outage in AWS. Currently, build logs older than a few hours will fail to load. Build caches for container-based builds are unavailable. Builds that depend on resources like Docker Hub, Quay.io, or other S3-dependent third-party services will fail with errors related to being unable to access those resources.
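
A practical takeaway for build authors: steps that fetch from third-party services can fail transiently during upstream outages like this one. As an illustrative sketch (not something Travis CI ships), wrapping such fetches in retries with exponential backoff lets a job survive brief upstream blips; the URL and retry parameters are assumptions.

```python
# Sketch: retry a flaky network fetch with exponential backoff.
# The URL, attempt count, and base delay are hypothetical tuning values.
import time
import urllib.request
from urllib.error import URLError

def fetch_with_retries(url: str, attempts: int = 5, base_delay: float = 1.0) -> bytes:
    for attempt in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                return resp.read()
        except URLError as exc:
            if attempt == attempts - 1:
                raise  # out of retries; let the build fail visibly
            delay = base_delay * (2 ** attempt)  # 1s, 2s, 4s, 8s, ...
            print(f"fetch failed ({exc}); retrying in {delay:.0f}s")
            time.sleep(delay)
    raise RuntimeError("unreachable")

if __name__ == "__main__":
    data = fetch_with_retries("https://example.com/resource")  # hypothetical URL
    print(len(data), "bytes")
```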

Travis-CI
Trusty Linux builds fail with `apt-get update` and 404

Feb 18, 05:39 UTC Resolved - A fix has been deployed for `language: csharp` builds. Please let us know if it still doesn't work for you; we would be happy to have a look. Thank you for your patience and happy building!Feb 17, 23:11 UTC Update - Most builds are now fixed. Only builds using `language: csharp` remain. The fix should be released to production soon. Thank you for your patience.Feb 17, 15:39 UTC Identified - We are currently working on a resolution for issues with Trusty Linux builds failing during `apt-get update` with a 404. This failure is due to an upstream PPA layout change. We'll be posting updates to https://github.com/travis-ci/travis-ci/issues/7332
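
When `apt-get update` starts returning 404s after an upstream archive is reorganized, a quick diagnostic is to probe each configured source's index directly. The sketch below is an illustration, not the fix Travis CI deployed; it checks the `Release` file for each `deb` line in the default Debian/Ubuntu sources list.

```python
# Sketch: find apt sources whose index files now return 404 (for example,
# after an upstream PPA layout change). For each `deb` line in sources.list,
# probe the corresponding dists/<suite>/Release file.
import urllib.request
from urllib.error import HTTPError, URLError

SOURCES = "/etc/apt/sources.list"  # default path on Debian/Ubuntu

def check_sources(path: str = SOURCES) -> None:
    with open(path) as fh:
        for line in fh:
            parts = line.split()
            if not parts or parts[0] != "deb":
                continue  # skip blanks, comments, and deb-src lines
            fields = [p for p in parts[1:] if not p.startswith("[")]
            if len(fields) < 2:
                continue  # malformed line; ignore in this sketch
            url, suite = fields[0], fields[1]
            release_url = f"{url.rstrip('/')}/dists/{suite}/Release"
            try:
                urllib.request.urlopen(release_url, timeout=15).close()
                status = "ok"
            except HTTPError as exc:
                status = f"HTTP {exc.code}"  # 404 here means the layout moved
            except URLError as exc:
                status = f"unreachable ({exc.reason})"
            print(f"{status:>20}  {release_url}")

if __name__ == "__main__":
    check_sources()
```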

Travis-CI
Emergency maintenance for .com logs database

Feb 17, 17:46 UTC Resolved - Maintenance work and verification have been completed.Feb 17, 15:22 UTC Monitoring - At this time we've completed verification and the maintenance work is done. We'll continue to monitor things closely for another 60 minutes, then resolve the incident if nothing is wrong.Feb 17, 15:15 UTC Update - We are continuing verification at this time. New logs should be visible in the web UI and CLI. Recently finished job logs may be delayed.Feb 17, 15:03 UTC Update - The database changes are completed. We are beginning testing and verification to see if we're ready to exit maintenance mode. We'll provide another update in 15 minutes.Feb 17, 14:47 UTC Update - This incident originally mentioned the .org logs database, but this maintenance is for the .com logs database. Apologies for any confusion.Feb 17, 14:38 UTC Identified - We've identified that we need to perform emergency maintenance on the .com logs database in order to increase its available resources and ensure overall stability of the logs system. Expected user impact: delays in seeing log updates in the web interface and in viewing logs for recently finished jobs. Our initial expectation is approximately 60 minutes in this state. We'll provide an update at least 15 minutes after we begin the needed changes.

Travis-CI
Brief delays in log processing

Feb 16, 16:42 UTC Resolved - Processing of build logs was delayed for short periods over the last 30 minutes. The queues have now cleared, and logs should be processing as expected.

Travis-CI
Build log delay on travis-ci.com

Feb 10, 16:07 UTC Resolved - Log processing has recovered and logs are showing normally again.Feb 10, 15:58 UTC Investigating - We're investigating delays in build logs for travis-ci.com. Builds should be marked as finished on time, but the logs may take longer than normal to show up.

Travis-CI
Delays for new builds on travis-ci.com and travis-ci.org

Feb 10, 10:12 UTC Resolved - Incoming builds are now processing normally again.Feb 10, 10:05 UTC Update - We're seeing similar delays for travis-ci.org.Feb 10, 09:52 UTC Investigating - We're investigating delays in processing incoming builds for travis-ci.com. New commits may take longer than normal to appear on Travis CI, but existing builds should run normally.

Travis-CI
Mac Builds Network Outage

Feb 9, 19:21 UTC Resolved - All macOS builds have resumed and are processing at full capacity. This appears to have been an upstream provider hiccup. The backlog should clear momentarily for .com users.Feb 9, 19:07 UTC Investigating - We are currently investigating a network outage on our macOS infrastructure.

Travis-CI
Log Processing Delay on travis-ci.com

Feb 8, 23:44 UTC Resolved - The backlog has drained and logs are being processed normally now. Thank you for your patience!
Feb 8, 23:05 UTC Monitoring - We are currently experiencing delays while processing logs due to a larger than normal backlog.

Travis-CI
Delays with GitHub syncs on travis-ci.com

Feb 7, 11:38 UTC Resolved - GitHub syncs are now running normally. We had to clear out one of the queues to prevent a component from running out of memory, which means that some scheduled syncs were cancelled. If you've added new repositories in the past day or two that haven't shown up on Travis CI yet, you can click the "Sync account" button on https://travis-ci.com/profile to schedule a new sync.
Feb 7, 10:34 UTC Update - We're continuing to investigate delays in GitHub syncs. At the moment, only the automatic daily syncs are delayed; manual syncs are not affected.
Feb 7, 09:53 UTC Investigating - We're investigating delays with synchronizing user accounts with GitHub on travis-ci.com. New repositories may take longer than normal to show up.
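
For those who prefer to script it, a manual sync can also be requested through the Travis CI API (v3) rather than the "Sync account" button. A minimal sketch in Python, assuming a valid API token in the TRAVIS_TOKEN environment variable; the endpoint paths follow the public v3 API, but treat this as illustrative rather than official tooling:

    # Request a GitHub account sync via the Travis CI v3 API.
    # Assumes TRAVIS_TOKEN holds a valid api.travis-ci.com token.
    import os
    import requests

    API = "https://api.travis-ci.com"
    HEADERS = {
        "Travis-API-Version": "3",
        "Authorization": "token " + os.environ["TRAVIS_TOKEN"],
    }

    # Look up the current user's id, then schedule a new sync.
    user = requests.get(API + "/user", headers=HEADERS).json()
    resp = requests.post(API + "/user/%d/sync" % user["id"], headers=HEADERS)
    print(resp.status_code)  # 200 means the sync was scheduled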

Travis-CI
Container-based Linux Precise infrastructure emergency maintenance

Feb 6, 00:31 UTC Resolved - This incident has been resolved.
Feb 6, 00:16 UTC Monitoring - The rollout is nearing completion, with capacity surpassing demand. We don't expect any build start delays for the remainder of the maintenance.
Feb 5, 21:56 UTC Identified - The container-based Linux infrastructure requires emergency maintenance to ensure all instances are running a known working version of the "worker" component. We had started rolling out a newer version this past Thursday, then began rolling back due to reports of mismatched exit codes and job statuses. Please expect partial-capacity behavior similar to times of high load during the maintenance, which we expect to take between one and two hours.
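
For context on the rollback: the reported bug was jobs whose recorded status disagreed with the build script's exit code. A minimal, purely illustrative statement of the invariant the rollback protected (this is not the actual "worker" code) looks like:

    # Illustrative invariant only: a job's reported status should follow
    # directly from its build script's exit code.
    def job_status(exit_code: int) -> str:
        return "passed" if exit_code == 0 else "failed"

    assert job_status(0) == "passed"
    assert job_status(1) == "failed"  # the rolled-back version could disagree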

Travis-CI
Network switch on Mac infrastructure

Feb 3, 17:57 UTC Completed - We have completed the network switch and restored Mac builds to full capacity!
Feb 3, 16:56 UTC Update - We are increasing the RAM on our pfSense boxes and will perform a failover. Network connections may experience hiccups for a short period of time.
Feb 3, 16:22 UTC Verifying - We have completed the switch and are monitoring closely.
Feb 3, 16:00 UTC In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary.
Feb 3, 15:52 UTC Scheduled - In response to the major outage that we experienced this week (https://www.traviscistatus.com/incidents/k79mjcv403c4), we are performing a switch in our networking setup. While mitigating the outage we bypassed our pfSense boxes and instead let our Cisco ASA handle DHCP. However, this limits the IP pool to 256 IPs, which forced us to reduce our capacity. We have now rebuilt the pfSense boxes and are bringing them back online, which will allow us to restore the service to full capacity. While we are taking precautions to mitigate any issues during the switch, we may experience some small service disruptions on macOS builds.
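
For context on the 256-IP ceiling: a single /24 DHCP scope spans 2^(32-24) = 256 addresses, and the usable pool is smaller still once the network, broadcast, and gateway addresses are excluded. A quick illustration in Python (10.0.0.0/24 is a hypothetical subnet, not Travis CI's actual addressing):

    # A /24 scope holds 256 addresses in total; fewer are usable for VMs.
    import ipaddress

    net = ipaddress.ip_network("10.0.0.0/24")
    print(net.num_addresses)      # 256
    print(net.num_addresses - 3)  # 253 usable after network, broadcast, gateway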

Travis-CI
MacOS queue backup & emergency maintenance

Feb 2, 01:35 UTC Resolved - The private repo backlog is clear. The public repo backlog continues to drop, which is typical for this day and hour. Thanks again for waiting! 👋❤️
Feb 2, 01:18 UTC Update - The backlog for private repos is still dropping; it is now below 50. Thank you again for your patience!
Feb 2, 00:13 UTC Update - The backlog for private repos is still dropping; it is now below 150. We will update again in an hour. Thank you for your patience! 💖
Feb 1, 22:49 UTC Update - We're seeing the backlog level off during peak usage hours. We will continue to issue updates as we monitor backlog progress.
Feb 1, 21:36 UTC Update - The private repo backlog has dropped steadily over the past hour, and we expect it will be caught up in less than 90 minutes. Thank you again for your patience!
Feb 1, 20:38 UTC Update - Our Mac infrastructure is processing builds normally for both travis-ci.org and travis-ci.com, albeit at reduced capacity. We are working on fixing our DHCP issues so we can restore full capacity. We cannot thank you enough for your enduring patience.
Feb 1, 19:33 UTC Update - We have increased capacity in production for both public and private repos. Due to ongoing issues with our DHCP setup, we have capped it below full capacity.
Feb 1, 19:07 UTC Monitoring - We are now running at reduced job processing capacity in production for both public and private repos.
Feb 1, 18:22 UTC Update - The patches we're testing need additional work. We expect production job capacity to come online in the next hour. Thank you for your patience through these multiple delays.
Feb 1, 17:47 UTC Update - We are testing further patches to skip jobs older than six hours in order to work through the massive backlog. We expect to see jobs flowing again in production within the next 30 minutes.
Feb 1, 16:45 UTC Update - We are on the verge of resuming processing of Mac builds on travis-ci.org. Thank you for hanging in there with us.
Feb 1, 16:09 UTC Update - We've begun performing the necessary networking changes and will begin testing them as soon as they're completed. We appreciate your continued patience.
Feb 1, 15:27 UTC Update - We have proceeded with limiting the maximum number of concurrent jobs on open source repositories that run jobs on our Mac infrastructure. You can find more details about this setting here: https://docs.travis-ci.com/user/customizing-the-build#sts=Limiting-Concurrent-Builds. This change will help with the throughput of your Linux builds on other repositories while we get our Mac infrastructure back up. We will revert this change once things settle. Thank you for your understanding.
Feb 1, 14:33 UTC Update - We are continuing to work on fixing the connectivity issue preventing us from restarting Mac build processing on both travis-ci.com and travis-ci.org. Meanwhile, we are also putting stopgap measures in place in our software platform to prevent disruption of our Linux build throughput. Thank you for your enduring patience.
Feb 1, 14:07 UTC Update - We made the difficult decision to cancel all pending Mac builds on travis-ci.org. Doing so should improve Linux build throughput, and it will hopefully help us get the Mac infrastructure back on its feet. We are sorry for this drastic measure.
Feb 1, 11:03 UTC Update - We're still attempting to resolve the connectivity issues. We appreciate your ongoing patience.
Feb 1, 09:30 UTC Update - We've identified connectivity issues in our macOS workers, and we're stopping all Mac builds to further investigate and fix them.
Feb 1, 07:11 UTC Update - Restarting the platform did not resolve all issues, and we are continuing to dig into the sources of instability.
Feb 1, 06:45 UTC Update - The virtualization platform has been fully restarted, and we're now bringing job processing capacity back online.
Feb 1, 04:51 UTC Update - The underlying VM infrastructure is still unstable, so we are coordinating with our infrastructure provider to perform a full restart. We will update again once we resume job processing.
Feb 1, 02:52 UTC Identified - Some misbehaving hosts have been restarted thanks to help from our upstream provider. We are bringing job processing capacity back online.
Feb 1, 02:27 UTC Update - We are stopping all job throughput to prevent runaway VM leakage while waiting for further insights from our upstream infrastructure provider.
Feb 1, 01:58 UTC Investigating - macOS queues for both public and private repos are backed up. We are working with our Mac infrastructure provider to identify contributing factors.
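
The "skip jobs older than six hours" stopgap in the 17:47 update amounts to a staleness filter at the front of the job queue. A minimal sketch of the idea, assuming a hypothetical Job record with an enqueue timestamp (illustrative only, not Travis CI's actual worker code):

    # Hypothetical staleness filter: discard queued jobs older than six hours.
    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    MAX_AGE = timedelta(hours=6)

    @dataclass
    class Job:
        id: int
        enqueued_at: datetime

    def next_runnable(queue):
        """Pop jobs in order, skipping any enqueued more than MAX_AGE ago."""
        now = datetime.now(timezone.utc)
        while queue:
            job = queue.pop(0)
            if now - job.enqueued_at <= MAX_AGE:
                return job
            print("skipping stale job", job.id)  # would be marked as skipped
        return None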
