This week on The PeopleSoft Administrator Podcast, we talk about Fluid Navigation. Dan shares his dislikes (and likes) of the default navigation style in PeopleTools 8.55. We also talk about new PeopleTools 8.55 features like Log Analyzer, CORS support and Log Correlation.
We want to make this podcast part of the community discussion on PeopleSoft administration. If you have comments, feedback, or topics you’d like us to talk about, we want to hear from you! You can email us at firstname.lastname@example.org, tweet us at @psa_io, or use the Twitter hashtag #psadminpodcast.
You can listen to the podcast here on psadmin.io, subscribe with your favorite podcast player using the URL below, or subscribe in iTunes.
- Chrome Extensions
- PSChrome Extension
- PSUtilities Extension
- SES Patching (1632712.1)
- Health Center Overview
- VERSION Run Controls
- Log Analyzer
- Problem Step Recorder
In this podcast, Logstash was mentioned as being used. Are you using the ELK stack/Elastic Stack for further analysis? Can you share your experience with building the filters for Logstash?
I am running a separate ELK stack for collecting our logs. All of our web access logs and APPSRV logs are sent there via Filebeat -> Logstash -> Elasticsearch.
Writing the Logstash filters was the most intensive part of the build. We aren’t completely done yet, as there is always more data we want to enrich with Logstash. My suggestion to get started with Logstash is to find a few sample log file lines and build your filters around those. Grok Debugger is a great place to test your Logstash filters.
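To make the Filebeat -> Logstash -> Elasticsearch flow concrete, here is a minimal Logstash pipeline sketch. The port, hosts, index name, and the use of the stock `COMBINEDAPACHELOG` pattern are my own illustrative assumptions, not the setup described above — you would swap in patterns that match your actual web server and app server log formats:

```
# Minimal sketch of a Filebeat -> Logstash -> Elasticsearch pipeline.
# Assumptions (not from the podcast): Filebeat ships to port 5044,
# Elasticsearch is local, and logs resemble the combined Apache format.

input {
  beats {
    port => 5044          # Filebeat's default Logstash output port
  }
}

filter {
  grok {
    # Replace with a pattern matching your PIA / app server logs
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "psoft-logs-%{+YYYY.MM.dd}"   # illustrative index name
  }
}
```

The filter block is where most of the effort goes; the input and output blocks rarely change once they work.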
Dan, thank you for your reply! We just started the installation process for the Elastic Stack. Are you using the HTTP access logs to track application usage?
Yes, heavily. If you are going to Alliance or Collaborate I’m giving talks about our setup. I’m working on something for the site too, but that won’t be ready for a while.
That is awesome! I won’t be able to attend either conference due to a PT upgrade project. After looking at the stack, I found Logstash to be crucial for parsing the logs. Can you share your experience with how complex it is, and do you have any tips? I have updated my HTTP logging with these parameters: “date time c-ip cs-method cs-uri sc-status bytes time-taken cs(User-Agent)”
That’s a good start. In Logstash, the grok filter is a great way to break the log line into fields and structure the log data. There is a Heroku app that really helps when building grok filters and lets you test them: http://grokdebug.herokuapp.com
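As a rough starting point for the W3C extended fields listed above, a grok filter might look like the sketch below. The target field names (`client_ip`, `time_taken`, etc.) are my own choices, not a standard, and you should verify each pattern against your real log lines in the Grok Debugger:

```
# Sketch of a grok filter for:
#   date time c-ip cs-method cs-uri sc-status bytes time-taken cs(User-Agent)
# Field names are illustrative assumptions; tune patterns to your logs.

filter {
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:timestamp} %{IP:client_ip} %{WORD:method} %{URIPATHPARAM:uri} %{NUMBER:status:int} %{NUMBER:bytes:int} %{NUMBER:time_taken:float} %{GREEDYDATA:user_agent}"
    }
  }
  date {
    # Promote the log's own date/time to @timestamp
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss"]
  }
}
```

Casting `status`, `bytes`, and `time_taken` to numeric types in the pattern lets you aggregate on them later in Kibana instead of treating them as strings.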