# Enable logging

Enable logs for Solo Enterprise for Istio components so that you can review them in the Gloo UI.

> **Note**: Enabling logs in the Gloo UI requires a Solo Enterprise for Istio license. Contact your account representative to obtain a license.
## Single cluster

1. Get your current installation Helm values, and save them in a file.

   ```sh
   helm get values gloo-platform -n gloo-mesh -o yaml > gloo-single.yaml
   open gloo-single.yaml
   ```

2. Add the following configuration to your Helm values file to enable logs for Solo Enterprise for Istio components. Enabling the management server is required to gather logs for the Solo Enterprise for Istio components, but the management server does not affect other aspects of your setup.

   ```yaml
   glooMgmtServer:
     enabled: true
   telemetryCollectorCustomization:
     pipelines:
       logs/ui:
         enabled: true
   ```

3. Upgrade your installation by using your updated values file.

   ```sh
   helm upgrade gloo-platform gloo-platform/gloo-platform \
     --namespace gloo-mesh \
     --values gloo-single.yaml --version ${MGMT_VERSION}
   ```

4. Verify that your settings were added to the telemetry collector configmap.

   ```sh
   kubectl get configmap gloo-telemetry-collector-config -n gloo-mesh -o yaml
   ```

5. Perform a rollout restart of the telemetry collector daemon set to force your configmap changes to be applied to the telemetry collector agent pod.

   ```sh
   kubectl rollout restart -n gloo-mesh daemonset/gloo-telemetry-collector-agent
   ```

6. Open the Gloo UI. The Gloo UI is served from the `gloo-mesh-ui` service on port 8090. You can connect by using the `meshctl` or `kubectl` CLIs.
   - **meshctl**: For more information, see the CLI documentation.

     ```sh
     meshctl dashboard
     ```

   - **kubectl**:
     1. Port-forward the `gloo-mesh-ui` service on 8090.

        ```sh
        kubectl port-forward -n gloo-mesh svc/gloo-mesh-ui 8090:8090
        ```

     2. Open your browser and connect to http://localhost:8090.

7. In the navigation pane, click **Logs**. Select the cluster, component, pod, and container for which you want to see the logs.
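Instead of hand-editing the exported values file, one option is to keep the new settings in a small overlay file and pass both files to Helm, which merges `--values` files left to right. This is a sketch, not part of the documented procedure; the overlay file name is illustrative.

```shell
# Illustrative overlay file holding only the new settings; the keys match
# the documented Helm values above.
cat > logs-ui-overlay.yaml <<'EOF'
glooMgmtServer:
  enabled: true
telemetryCollectorCustomization:
  pipelines:
    logs/ui:
      enabled: true
EOF

# Sanity check that the pipeline key is present before you upgrade.
grep -q 'logs/ui:' logs-ui-overlay.yaml && echo "overlay staged"

# Later values files win on conflict, so the overlay takes effect:
# helm upgrade gloo-platform gloo-platform/gloo-platform \
#   --namespace gloo-mesh \
#   --values gloo-single.yaml --values logs-ui-overlay.yaml \
#   --version ${MGMT_VERSION}
```

Keeping the overlay separate also makes it easy to remove the log pipeline later: rerun the upgrade without the overlay file.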
## Multicluster

1. Get the Helm values files for your current version.
   1. Get your current values for the management plane.

      ```sh
      helm get values gloo-platform -n gloo-mesh -o yaml --kube-context ${context1} > mgmt-plane.yaml
      open mgmt-plane.yaml
      ```

   2. Get your current values for the data plane.

      ```sh
      helm get values gloo-platform -n gloo-mesh -o yaml --kube-context ${context2} > data-plane.yaml
      open data-plane.yaml
      ```

2. In the Helm values file for the management plane, add the following configuration to enable logs for Solo Enterprise for Istio components.

   ```yaml
   telemetryCollectorCustomization:
     pipelines:
       logs/ui:
         enabled: true
   ```

3. In the Helm values file for the data plane, add the following configuration to enable logs for Solo Enterprise for Istio components. Logs are automatically sent to the telemetry gateway in the management cluster.

   ```yaml
   telemetryCollectorCustomization:
     pipelines:
       logs/ui:
         enabled: true
   ```

4. Upgrade the management plane release.

   ```sh
   helm upgrade gloo-platform gloo-platform/gloo-platform \
     --kube-context ${context1} \
     --namespace gloo-mesh \
     -f mgmt-plane.yaml \
     --version ${MGMT_VERSION}
   ```

5. Verify that your settings are applied in the management cluster.
   1. Verify that your settings were added to the telemetry gateway configmap.

      ```sh
      kubectl get configmap gloo-telemetry-gateway-config -n gloo-mesh -o yaml --context ${context1}
      ```

   2. Perform a rollout restart of the telemetry gateway deployment to force your configmap changes to be applied to the telemetry gateway pod.

      ```sh
      kubectl rollout restart -n gloo-mesh deployment/gloo-telemetry-gateway --context ${context1}
      ```

6. Upgrade the data plane release.

   ```sh
   helm upgrade gloo-platform gloo-platform/gloo-platform \
     --kube-context ${context2} \
     --namespace gloo-mesh \
     -f data-plane.yaml \
     --version ${MGMT_VERSION}
   ```

7. Verify that your settings are applied in the workload cluster.
   1. Verify that your settings were added to the telemetry collector configmap.

      ```sh
      kubectl get configmap gloo-telemetry-collector-config -n gloo-mesh -o yaml --context ${context2}
      ```

   2. Perform a rollout restart of the telemetry collector daemon set to force your configmap changes to be applied to the telemetry collector agent pods.

      ```sh
      kubectl rollout restart -n gloo-mesh daemonset/gloo-telemetry-collector-agent --context ${context2}
      ```
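The rollout restarts above trigger the restart and return immediately, before the pods are actually replaced. If you want to block until each rollout finishes, you can wrap the two standard `kubectl rollout` subcommands in a small helper. This is a sketch, not part of the documented procedure; the function name is illustrative.

```shell
# Restart a telemetry workload and wait until the rollout completes.
# Uses only standard kubectl subcommands; workload and namespace names
# come from the steps above.
restart_and_wait() {
  local target=$1 ctx=$2
  kubectl rollout restart -n gloo-mesh "$target" --context "$ctx"
  kubectl rollout status -n gloo-mesh "$target" --context "$ctx" --timeout=120s
}

# Example calls (uncomment and run against your clusters):
# restart_and_wait deployment/gloo-telemetry-gateway ${context1}
# restart_and_wait daemonset/gloo-telemetry-collector-agent ${context2}
```

`kubectl rollout status` exits nonzero if the rollout does not complete within the timeout, so the helper is safe to use in upgrade scripts.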
8. Open the Gloo UI. The Gloo UI is served from the `gloo-mesh-ui` service on port 8090 in the cluster where the management plane is deployed. You can connect by using the `meshctl` or `kubectl` CLIs.
   - **meshctl**: For more information, see the CLI documentation.

     ```sh
     meshctl dashboard --kube-context ${context1}
     ```

   - **kubectl**:
     1. Port-forward the `gloo-mesh-ui` service on 8090.

        ```sh
        kubectl port-forward -n gloo-mesh --context ${context1} svc/gloo-mesh-ui 8090:8090
        ```

     2. Open your browser and connect to http://localhost:8090.

9. In the navigation pane, click **Logs**. Select the cluster, component, pod, and container for which you want to see the logs.